[ 574.682960] env[67843]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 575.299552] env[67893]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 576.639468] env[67893]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=67893) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 576.639803] env[67893]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=67893) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 576.640293] env[67893]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=67893) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 576.640293] env[67893]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 576.838893] env[67893]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=67893) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}}
[ 576.849036] env[67893]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.010s {{(pid=67893) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}}
[ 576.954752] env[67893]: INFO nova.virt.driver [None req-634e0071-d910-41f5-996d-aa78e289185d None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 577.026830] env[67893]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 577.026993] env[67893]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 577.027107] env[67893]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=67893) __init__ /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:242}}
[ 579.896548] env[67893]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-31957e1f-6ad3-4c18-b980-c31498e2b2f3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 579.912991] env[67893]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=67893) _create_session /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:242}}
[ 579.913180] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-c8e751b1-548c-4785-9395-9dd34bd8991b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 579.937521] env[67893]: INFO oslo_vmware.api [-] Successfully established new session; session ID is 3ba39.
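The session bootstrap above (suds client creation, SessionManager.Login, new session ID) is plain oslo.vmware. A minimal sketch of the same setup, not nova's code: only the vCenter host is taken from the log, while the username and password below are placeholders.

    from oslo_vmware import api

    # create_session=True logs in immediately, producing the
    # "Logging into host" / "Successfully established new session" lines.
    session = api.VMwareAPISession(
        'vc1.osci.c.eu-de-1.cloud.sap',   # vCenter host (from the log)
        'administrator@vsphere.local',    # hypothetical username
        'secret',                         # hypothetical password
        api_retry_count=10,               # retries for failed API calls
        task_poll_interval=0.5)           # seconds between task polls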
[ 579.937673] env[67893]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 2.911s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 579.938223] env[67893]: INFO nova.virt.vmwareapi.driver [None req-634e0071-d910-41f5-996d-aa78e289185d None None] VMware vCenter version: 7.0.3
[ 579.941581] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c42c65f8-964e-4981-9fca-4b4f1bd7adf7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 579.958954] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8826dac1-ea92-4c6e-b72b-920f193a09e0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 579.964854] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d1aa5bf-919e-41bd-bfac-41755db2107a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 579.971287] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b74e228d-c1dd-40c0-94d9-661e51909f60 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 579.984098] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ddf8cfe-6051-4dd4-acb3-c9a7e785622c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 579.989866] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96fadc58-a464-4534-a7c5-d16edc2a344f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 580.019942] env[67893]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-313c810e-4bad-451b-8a57-715ab69067e1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 580.024760] env[67893]: DEBUG nova.virt.vmwareapi.driver [None req-634e0071-d910-41f5-996d-aa78e289185d None None] Extension org.openstack.compute already exists. {{(pid=67893) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:224}}
[ 580.027352] env[67893]: INFO nova.compute.provider_config [None req-634e0071-d910-41f5-996d-aa78e289185d None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
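The recurring "Acquiring lock ... by ..." / "acquired ... waited" / ""released" ... held" triplets in this log are emitted by oslo.concurrency's lockutils wrapper. A minimal sketch of the pattern, with a hypothetical function body; the lock name matches the one used above:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('oslo_vmware_api_lock')
    def create_session():
        # The decorator's inner() wrapper logs the acquire/release lines,
        # including how long the caller waited and how long the lock was held.
        pass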
[ 580.045341] env[67893]: DEBUG nova.context [None req-634e0071-d910-41f5-996d-aa78e289185d None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),86906ac0-6db3-4999-adb3-2ceb177be793(cell1) {{(pid=67893) load_cells /opt/stack/nova/nova/context.py:464}}
[ 580.047238] env[67893]: DEBUG oslo_concurrency.lockutils [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 580.047461] env[67893]: DEBUG oslo_concurrency.lockutils [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 580.048414] env[67893]: DEBUG oslo_concurrency.lockutils [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 580.048857] env[67893]: DEBUG oslo_concurrency.lockutils [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] Acquiring lock "86906ac0-6db3-4999-adb3-2ceb177be793" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 580.049065] env[67893]: DEBUG oslo_concurrency.lockutils [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] Lock "86906ac0-6db3-4999-adb3-2ceb177be793" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 580.050013] env[67893]: DEBUG oslo_concurrency.lockutils [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] Lock "86906ac0-6db3-4999-adb3-2ceb177be793" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 580.074425] env[67893]: INFO dbcounter [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] Registered counter for database nova_cell0
[ 580.082819] env[67893]: INFO dbcounter [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] Registered counter for database nova_cell1
[ 580.085596] env[67893]: DEBUG oslo_db.sqlalchemy.engines [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=67893) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 580.085945] env[67893]: DEBUG oslo_db.sqlalchemy.engines [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=67893) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 580.090513] env[67893]: DEBUG dbcounter [-] [67893] Writer thread running {{(pid=67893) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 580.091220] env[67893]: DEBUG dbcounter [-] [67893] Writer thread running {{(pid=67893) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 580.093671] env[67893]: ERROR nova.db.main.api [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 580.093671] env[67893]: result = function(*args, **kwargs)
[ 580.093671] env[67893]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 580.093671] env[67893]: return func(*args, **kwargs)
[ 580.093671] env[67893]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 580.093671] env[67893]: result = fn(*args, **kwargs)
[ 580.093671] env[67893]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 580.093671] env[67893]: return f(*args, **kwargs)
[ 580.093671] env[67893]: File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 580.093671] env[67893]: return db.service_get_minimum_version(context, binaries)
[ 580.093671] env[67893]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 580.093671] env[67893]: _check_db_access()
[ 580.093671] env[67893]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 580.093671] env[67893]: stacktrace = ''.join(traceback.format_stack())
[ 580.093671] env[67893]:
[ 580.094681] env[67893]: ERROR nova.db.main.api [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 580.094681] env[67893]: result = function(*args, **kwargs)
[ 580.094681] env[67893]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 580.094681] env[67893]: return func(*args, **kwargs)
[ 580.094681] env[67893]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 580.094681] env[67893]: result = fn(*args, **kwargs)
[ 580.094681] env[67893]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 580.094681] env[67893]: return f(*args, **kwargs)
[ 580.094681] env[67893]: File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 580.094681] env[67893]: return db.service_get_minimum_version(context, binaries)
[ 580.094681] env[67893]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 580.094681] env[67893]: _check_db_access()
[ 580.094681] env[67893]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 580.094681] env[67893]: stacktrace = ''.join(traceback.format_stack())
[ 580.094681] env[67893]:
[ 580.095280] env[67893]: WARNING nova.objects.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 580.095280] env[67893]: WARNING nova.objects.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] Failed to get minimum service version for cell 86906ac0-6db3-4999-adb3-2ceb177be793
[ 580.095605] env[67893]: DEBUG oslo_concurrency.lockutils [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] Acquiring lock "singleton_lock" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
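The two ERROR blocks above are not crashes: nova-compute forbids direct database access, and the wrapper in nova/db/main/api.py captures the offending call stack and logs it before refusing the query, which is why the "Failed to get minimum service version" warnings follow. A simplified stand-in for that guard pattern (the names below are illustrative, not nova's exact code):

    import functools
    import logging
    import traceback

    LOG = logging.getLogger(__name__)
    DISABLE_DB_ACCESS = True  # flipped on in nova-compute processes

    class DBNotAllowed(Exception):
        pass

    def db_access_guard(f):
        # Hypothetical simplified wrapper; nova applies an equivalent
        # check inside its DB API decorators.
        @functools.wraps(f)
        def wrapper(*args, **kwargs):
            if DISABLE_DB_ACCESS:
                # Log the full call path, mirroring _check_db_access(),
                # then refuse the query.
                stacktrace = ''.join(traceback.format_stack())
                LOG.error('No DB access allowed in nova-compute: %s',
                          stacktrace)
                raise DBNotAllowed('nova-compute')
            return f(*args, **kwargs)
        return wrapper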
[ 580.095761] env[67893]: DEBUG oslo_concurrency.lockutils [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] Acquired lock "singleton_lock" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 580.096007] env[67893]: DEBUG oslo_concurrency.lockutils [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] Releasing lock "singleton_lock" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 580.096324] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] Full set of CONF: {{(pid=67893) _wait_for_exit_or_signal /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/service.py:362}}
[ 580.096465] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ******************************************************************************** {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2589}}
[ 580.096591] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] Configuration options gathered from: {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2590}}
[ 580.096723] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2591}}
[ 580.096908] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2592}}
[ 580.097043] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ================================================================================ {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2594}}
[ 580.097254] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] allow_resize_to_same_host = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.097420] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] arq_binding_timeout = 300 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.097550] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] backdoor_port = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.097692] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] backdoor_socket = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.097874] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] block_device_allocate_retries = 60 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.098054] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] block_device_allocate_retries_interval = 3 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.098228] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cert = self.pem {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.098396] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.098565] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] compute_monitors = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.098732] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] config_dir = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.098924] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] config_drive_format = iso9660 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.099074] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.099244] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] config_source = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.099410] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] console_host = devstack {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.099572] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] control_exchange = nova {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.099732] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cpu_allocation_ratio = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.099890] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] daemon = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.100067] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] debug = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.100227] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] default_access_ip_network_name = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.100391] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] default_availability_zone = nova {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.100546] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] default_ephemeral_format = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.100703] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] default_green_pool_size = 1000 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.100940] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.101115] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] default_schedule_zone = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.101272] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] disk_allocation_ratio = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.101433] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] enable_new_services = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.101609] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] enabled_apis = ['osapi_compute'] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.101769] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] enabled_ssl_apis = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.101925] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] flat_injected = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.102090] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] force_config_drive = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.102249] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] force_raw_images = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.102416] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] graceful_shutdown_timeout = 5 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.102572] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] heal_instance_info_cache_interval = 60 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.102785] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] host = cpu-1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.102956] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.103133] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] initial_disk_allocation_ratio = 1.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.103293] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] initial_ram_allocation_ratio = 1.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.103503] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.103666] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] instance_build_timeout = 0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.103824] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] instance_delete_interval = 300 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.103991] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] instance_format = [instance: %(uuid)s] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.104171] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] instance_name_template = instance-%08x {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.104330] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] instance_usage_audit = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.104497] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] instance_usage_audit_period = month {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.104662] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.104827] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] instances_path = /opt/stack/data/nova/instances {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.104993] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] internal_service_availability_zone = internal {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.105408] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] key = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.105408] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] live_migration_retry_count = 30 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.105532] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] log_config_append = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.105629] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.105784] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] log_dir = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.105941] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] log_file = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.106076] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] log_options = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.106236] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] log_rotate_interval = 1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.106401] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] log_rotate_interval_type = days {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.106565] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] log_rotation_type = none {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.106693] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.106819] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.106987] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.107163] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.107291] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.107450] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] long_rpc_timeout = 1800 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.107604] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] max_concurrent_builds = 10 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.107779] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] max_concurrent_live_migrations = 1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.107948] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] max_concurrent_snapshots = 5 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.108121] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] max_local_block_devices = 3 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.108281] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] max_logfile_count = 30 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.108435] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] max_logfile_size_mb = 200 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.108591] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] maximum_instance_delete_attempts = 5 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.108760] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] metadata_listen = 0.0.0.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.108952] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] metadata_listen_port = 8775 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.109139] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] metadata_workers = 2 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.109302] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] migrate_max_retries = -1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.109468] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] mkisofs_cmd = genisoimage {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.109671] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] my_block_storage_ip = 10.180.1.21 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.109802] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] my_ip = 10.180.1.21 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.109962] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] network_allocate_retries = 0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.110150] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.110317] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] osapi_compute_listen = 0.0.0.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.110479] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] osapi_compute_listen_port = 8774 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.110644] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] osapi_compute_unique_server_name_scope = {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.110808] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] osapi_compute_workers = 2 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.110968] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] password_length = 12 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.111139] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] periodic_enable = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.111297] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] periodic_fuzzy_delay = 60 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.111462] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] pointer_model = usbtablet {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.111626] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] preallocate_images = none {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.111783] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] publish_errors = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.111911] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] pybasedir = /opt/stack/nova {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.112076] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ram_allocation_ratio = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.112240] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] rate_limit_burst = 0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.112406] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] rate_limit_except_level = CRITICAL {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.112561] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] rate_limit_interval = 0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.112714] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] reboot_timeout = 0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.112868] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] reclaim_instance_interval = 0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.113031] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] record = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.113201] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] reimage_timeout_per_gb = 60 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.113365] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] report_interval = 120 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.113522] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] rescue_timeout = 0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.113680] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] reserved_host_cpus = 0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.113836] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] reserved_host_disk_mb = 0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.113998] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] reserved_host_memory_mb = 512 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.114160] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] reserved_huge_pages = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.114315] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] resize_confirm_window = 0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.114468] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] resize_fs_using_block_device = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.114624] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] resume_guests_state_on_host_boot = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.114790] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.114950] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] rpc_response_timeout = 60 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.115120] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] run_external_periodic_tasks = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.115290] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] running_deleted_instance_action = reap {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.115449] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] running_deleted_instance_poll_interval = 1800 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.115606] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] running_deleted_instance_timeout = 0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.115761] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] scheduler_instance_sync_interval = 120 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.115927] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] service_down_time = 720 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.116106] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] servicegroup_driver = db {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.116266] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] shelved_offload_time = 0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.116421] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] shelved_poll_interval = 3600 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.116587] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] shutdown_timeout = 0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.116747] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] source_is_ipv6 = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.116905] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ssl_only = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.117158] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.117326] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] sync_power_state_interval = 600 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.117488] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] sync_power_state_pool_size = 1000 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.117656] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] syslog_log_facility = LOG_USER {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.117837] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] tempdir = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.118009] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] timeout_nbd = 10 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.118184] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] transport_url = **** {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.118344] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] update_resources_interval = 0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.118500] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] use_cow_images = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.118659] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] use_eventlog = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.118878] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] use_journal = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.119068] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] use_json = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.119232] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] use_rootwrap_daemon = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.119391] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] use_stderr = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.119547] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] use_syslog = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.119700] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vcpu_pin_set = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.119869] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vif_plugging_is_fatal = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.120120] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vif_plugging_timeout = 300 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.120401] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] virt_mkfs = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.120619] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] volume_usage_poll_interval = 0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.120795] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] watch_log_file = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.120973] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] web = /usr/share/spice-html5 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 580.121246] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_concurrency.disable_process_locking = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.121568] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.121760] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.121932] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.122118] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.122294] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.122461] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.122645] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api.auth_strategy = keystone {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.122813] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api.compute_link_prefix = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.122991] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.123184] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api.dhcp_domain = novalocal {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.123353] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api.enable_instance_password = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.123517] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api.glance_link_prefix = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.123681] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.123851] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.124024] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api.instance_list_per_project_cells = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.124217] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api.list_records_by_skipping_down_cells = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.124390] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api.local_metadata_per_cell = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.124563] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api.max_limit = 1000 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.124731] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api.metadata_cache_expiration = 15 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.124908] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api.neutron_default_tenant_id = default {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.125085] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api.use_forwarded_for = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.125255] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api.use_neutron_default_nets = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.125424] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.125589] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.125752] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.125928] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.126112] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api.vendordata_dynamic_targets = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.126276] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api.vendordata_jsonfile_path = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.126455] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.126647] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.backend = dogpile.cache.memcached {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.126816] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.backend_argument = **** {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.126989] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.config_prefix = cache.oslo {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.127193] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.dead_timeout = 60.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.127376] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.debug_cache_backend = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.127542] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.enable_retry_client = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.127718] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.enable_socket_keepalive = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.127911] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.enabled = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.128095] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.expiration_time = 600 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.128266] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.hashclient_retry_attempts = 2 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.128437] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.hashclient_retry_delay = 1.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.128602] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.memcache_dead_retry = 300 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.128786] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.memcache_password = {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.128966] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.129147] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.129313] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.memcache_pool_maxsize = 10 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.129479] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.129642] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.memcache_sasl_enabled = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.129824] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.129998] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.memcache_socket_timeout = 1.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.130202] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.memcache_username = {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.130388] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.proxies = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.130559] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.retry_attempts = 2 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.130729] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.retry_delay = 0.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.130895] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.socket_keepalive_count = 1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.131071] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.socket_keepalive_idle = 1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.131238] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.socket_keepalive_interval = 1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.131398] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.tls_allowed_ciphers = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.131556] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.tls_cafile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.131715] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.tls_certfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.131879] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.tls_enabled = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.132046] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cache.tls_keyfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.132222] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cinder.auth_section = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.132399] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cinder.auth_type = password {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.132564] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cinder.cafile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.132741] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cinder.catalog_info = volumev3::publicURL {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.132903] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cinder.certfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.133079] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cinder.collect_timing = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.133265] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cinder.cross_az_attach = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.133442] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cinder.debug = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.133605] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cinder.endpoint_template = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.133770] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cinder.http_retries = 3 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.133937] env[67893]: DEBUG oslo_service.service [None 
req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cinder.insecure = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.134112] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cinder.keyfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.134288] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cinder.os_region_name = RegionOne {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.134452] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cinder.split_loggers = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.134612] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cinder.timeout = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.134784] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.134947] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] compute.cpu_dedicated_set = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.135127] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] compute.cpu_shared_set = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.135295] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] compute.image_type_exclude_list = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.135458] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.135622] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] compute.max_concurrent_disk_ops = 0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.135842] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] compute.max_disk_devices_to_attach = -1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.136042] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.136273] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.136465] env[67893]: DEBUG oslo_service.service 
[None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] compute.resource_provider_association_refresh = 300 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.136635] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] compute.shutdown_retry_interval = 10 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.136820] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.137010] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] conductor.workers = 2 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.137198] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] console.allowed_origins = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.137361] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] console.ssl_ciphers = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.137532] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] console.ssl_minimum_version = default {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.137742] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] consoleauth.token_ttl = 600 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.137917] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cyborg.cafile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.138090] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cyborg.certfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.138258] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cyborg.collect_timing = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.138421] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cyborg.connect_retries = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.138580] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cyborg.connect_retry_delay = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.138761] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cyborg.endpoint_override = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.138956] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] 
cyborg.insecure = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.139139] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cyborg.keyfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.139303] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cyborg.max_version = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.139462] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cyborg.min_version = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.139620] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cyborg.region_name = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.139779] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cyborg.service_name = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.139951] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cyborg.service_type = accelerator {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.140124] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cyborg.split_loggers = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.140282] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cyborg.status_code_retries = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.140440] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cyborg.status_code_retry_delay = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.140600] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cyborg.timeout = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.140779] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.140944] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] cyborg.version = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.141143] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] database.backend = sqlalchemy {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.141324] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] database.connection = **** {{(pid=67893) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.141497] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] database.connection_debug = 0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.141670] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] database.connection_parameters = {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.141861] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] database.connection_recycle_time = 3600 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.142054] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] database.connection_trace = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.142226] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] database.db_inc_retry_interval = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.142532] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] database.db_max_retries = 20 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.142584] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] database.db_max_retry_interval = 10 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.142716] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] database.db_retry_interval = 1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.142888] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] database.max_overflow = 50 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.143073] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] database.max_pool_size = 5 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.143822] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] database.max_retries = 10 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.143822] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.143822] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] database.mysql_wsrep_sync_wait = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.143822] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] database.pool_timeout = None {{(pid=67893) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.144097] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] database.retry_interval = 10 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.144097] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] database.slave_connection = **** {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.144212] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] database.sqlite_synchronous = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.144382] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] database.use_db_reconnect = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.144577] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api_database.backend = sqlalchemy {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.144734] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api_database.connection = **** {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.144931] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api_database.connection_debug = 0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.145121] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api_database.connection_parameters = {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.145290] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api_database.connection_recycle_time = 3600 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.145458] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api_database.connection_trace = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.145620] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api_database.db_inc_retry_interval = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.145786] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api_database.db_max_retries = 20 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.145953] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api_database.db_max_retry_interval = 10 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.146126] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api_database.db_retry_interval = 1 {{(pid=67893) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.146298] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api_database.max_overflow = 50 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.146458] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api_database.max_pool_size = 5 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.146650] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api_database.max_retries = 10 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.146798] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.146960] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.147137] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api_database.pool_timeout = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.147306] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api_database.retry_interval = 10 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.147467] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api_database.slave_connection = **** {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.147631] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] api_database.sqlite_synchronous = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.147838] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] devices.enabled_mdev_types = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.148042] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.148212] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ephemeral_storage_encryption.enabled = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.148378] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.148546] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.api_servers = None {{(pid=67893) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.148710] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.cafile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.148871] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.certfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.149042] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.collect_timing = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.149204] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.connect_retries = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.149362] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.connect_retry_delay = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.149522] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.debug = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.149686] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.default_trusted_certificate_ids = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.149847] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.enable_certificate_validation = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.150014] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.enable_rbd_download = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.150175] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.endpoint_override = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.150338] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.insecure = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.150497] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.keyfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.150653] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.max_version = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.150812] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.min_version = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.150999] env[67893]: DEBUG 
oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.num_retries = 3 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.151186] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.rbd_ceph_conf = {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.151350] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.rbd_connect_timeout = 5 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.151519] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.rbd_pool = {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.151686] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.rbd_user = {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.151850] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.region_name = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.152017] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.service_name = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.152193] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.service_type = image {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.152358] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.split_loggers = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.152518] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.status_code_retries = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.152676] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.status_code_retry_delay = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.152834] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.timeout = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.153022] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.153190] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.verify_glance_signatures = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.153350] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] glance.version = None {{(pid=67893) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.153517] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] guestfs.debug = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.153686] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] hyperv.config_drive_cdrom = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.153859] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] hyperv.config_drive_inject_password = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.154058] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.154231] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] hyperv.enable_instance_metrics_collection = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.154396] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] hyperv.enable_remotefx = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.154569] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] hyperv.instances_path_share = {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.154733] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] hyperv.iscsi_initiator_list = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.154894] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] hyperv.limit_cpu_features = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.155067] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.155231] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.155393] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] hyperv.power_state_check_timeframe = 60 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.155561] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.155732] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=67893) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.155893] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] hyperv.use_multipath_io = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.156066] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] hyperv.volume_attach_retry_count = 10 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.156230] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.156390] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] hyperv.vswitch_name = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.156553] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.156719] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] mks.enabled = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.157109] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.157308] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] image_cache.manager_interval = 2400 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.157479] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] image_cache.precache_concurrency = 1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.157651] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] image_cache.remove_unused_base_images = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.157853] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.158044] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.158230] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] image_cache.subdirectory_name = _base {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.158407] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ironic.api_max_retries 
= 60 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.158572] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ironic.api_retry_interval = 2 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.158756] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ironic.auth_section = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.158938] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ironic.auth_type = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.159122] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ironic.cafile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.159287] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ironic.certfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.159450] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ironic.collect_timing = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.159612] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ironic.conductor_group = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.159780] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ironic.connect_retries = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.159962] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ironic.connect_retry_delay = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.160152] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ironic.endpoint_override = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.160319] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ironic.insecure = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.160480] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ironic.keyfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.160638] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ironic.max_version = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.160798] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ironic.min_version = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.160963] env[67893]: DEBUG 
oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ironic.peer_list = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.161135] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ironic.region_name = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.161301] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ironic.serial_console_state_timeout = 10 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.161461] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ironic.service_name = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.161629] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ironic.service_type = baremetal {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.161791] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ironic.split_loggers = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.161951] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ironic.status_code_retries = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.162121] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ironic.status_code_retry_delay = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.162281] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ironic.timeout = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.162462] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.162622] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ironic.version = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.162829] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.163049] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] key_manager.fixed_key = **** {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.163251] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.163418] env[67893]: DEBUG oslo_service.service [None 
req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] barbican.barbican_api_version = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.163578] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] barbican.barbican_endpoint = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.163749] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] barbican.barbican_endpoint_type = public {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.163908] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] barbican.barbican_region_name = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.164079] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] barbican.cafile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.164241] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] barbican.certfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.164404] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] barbican.collect_timing = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.164568] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] barbican.insecure = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.164717] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] barbican.keyfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.164879] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] barbican.number_of_retries = 60 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.165049] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] barbican.retry_delay = 1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.165214] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] barbican.send_service_user_token = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.165375] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] barbican.split_loggers = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.165531] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] barbican.timeout = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.165700] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] barbican.verify_ssl = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.165878] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] barbican.verify_ssl_path = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.166056] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] barbican_service_user.auth_section = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.166225] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] barbican_service_user.auth_type = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.166385] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] barbican_service_user.cafile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.166543] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] barbican_service_user.certfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.166704] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] barbican_service_user.collect_timing = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.166865] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] barbican_service_user.insecure = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.167030] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] barbican_service_user.keyfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.167199] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] barbican_service_user.split_loggers = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.167357] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] barbican_service_user.timeout = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.167522] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vault.approle_role_id = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.167687] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vault.approle_secret_id = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.167874] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vault.cafile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.168047] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vault.certfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.168216] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vault.collect_timing = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.168379] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vault.insecure = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.168540] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vault.keyfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.168743] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vault.kv_mountpoint = secret {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.168929] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vault.kv_path = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.169113] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vault.kv_version = 2 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.169280] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vault.namespace = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.169440] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vault.root_token_id = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.169602] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vault.split_loggers = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.169763] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vault.ssl_ca_crt_file = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.169924] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vault.timeout = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.170096] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vault.use_ssl = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.170272] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
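The `section.option = value` records throughout this dump are emitted by oslo.config's `log_opt_values()` helper, which oslo.service calls once at service launch; that is why every record carries the `log_opt_values .../oslo_config/cfg.py:2609` tag. A minimal sketch of that mechanism, using a tiny illustrative option schema rather than Nova's real one:

```python
import logging

from oslo_config import cfg

# Stand-in option schema; Nova registers hundreds of options like these.
CONF = cfg.ConfigOpts()
CONF.register_opts(
    [cfg.StrOpt('kv_mountpoint', default='secret'),
     cfg.IntOpt('kv_version', default=2)],
    group='vault')

logging.basicConfig(level=logging.DEBUG)
LOG = logging.getLogger('oslo_service.service')

CONF([], project='nova')                 # parse an (empty) command line
CONF.log_opt_values(LOG, logging.DEBUG)  # emits "vault.kv_mountpoint = secret", ...
```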
[ 580.170442] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] keystone.auth_section = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.170605] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] keystone.auth_type = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.170764] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] keystone.cafile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.170925] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] keystone.certfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.171100] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] keystone.collect_timing = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.171263] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] keystone.connect_retries = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.171424] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] keystone.connect_retry_delay = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.171585] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] keystone.endpoint_override = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.171759] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] keystone.insecure = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.171940] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] keystone.keyfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.172115] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] keystone.max_version = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.172279] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] keystone.min_version = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.172443] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] keystone.region_name = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.172602] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] keystone.service_name = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.172772] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] keystone.service_type = identity {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.172937] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] keystone.split_loggers = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.173110] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] keystone.status_code_retries = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.173272] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] keystone.status_code_retry_delay = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.173432] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] keystone.timeout = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.173615] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.173779] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] keystone.version = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.173981] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.connection_uri = {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.174160] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.cpu_mode = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.174331] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.cpu_model_extra_flags = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.174502] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.cpu_models = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.174674] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.cpu_power_governor_high = performance {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.174880] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.cpu_power_governor_low = powersave {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.175074] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.cpu_power_management = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.175257] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.175424] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.device_detach_attempts = 8 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.175590] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.device_detach_timeout = 20 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.175756] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.disk_cachemodes = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.175919] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.disk_prefix = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.176109] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.enabled_perf_events = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.176808] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.file_backed_memory = 0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.176808] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.gid_maps = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.176808] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.hw_disk_discard = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.176808] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.hw_machine_type = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.176966] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.images_rbd_ceph_conf = {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.177061] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.177225] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.177394] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.images_rbd_glance_store_name = {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.177562] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.images_rbd_pool = rbd {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.177752] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.images_type = default {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.177940] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.images_volume_group = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.178124] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.inject_key = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.178289] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.inject_partition = -2 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.178449] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.inject_password = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.178610] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.iscsi_iface = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.178797] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.iser_use_multipath = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.178972] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.live_migration_bandwidth = 0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.179149] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.179311] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.live_migration_downtime = 500 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.179471] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.179641] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.179811] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.live_migration_inbound_addr = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.179971] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.180145] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.live_migration_permit_post_copy = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.180303] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.live_migration_scheme = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.180474] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.live_migration_timeout_action = abort {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.180633] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.live_migration_tunnelled = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.180792] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.live_migration_uri = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.180982] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.live_migration_with_native_tls = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.181160] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.max_queues = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.181325] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.181483] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.nfs_mount_options = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.181783] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.181960] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.182136] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.num_iser_scan_tries = 5 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.182296] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.num_memory_encrypted_guests = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.182464] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.182641] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.num_pcie_ports = 0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.182817] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.num_volume_scan_tries = 5 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.182981] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.pmem_namespaces = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.183154] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.quobyte_client_cfg = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.183448] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.183621] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.rbd_connect_timeout = 5 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.183786] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.183974] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.184157] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.rbd_secret_uuid = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.184321] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.rbd_user = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.184484] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.184658] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.remote_filesystem_transport = ssh {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.184820] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.rescue_image_id = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.185032] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.rescue_kernel_id = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.185235] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.rescue_ramdisk_id = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.185417] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.185582] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.rx_queue_size = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.185756] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.smbfs_mount_options = {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.186051] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.186228] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.snapshot_compression = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.186392] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.snapshot_image_format = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.186613] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.186781] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.sparse_logical_volumes = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.186969] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.swtpm_enabled = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.187172] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.swtpm_group = tss {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.187346] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.swtpm_user = tss {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.187519] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.sysinfo_serial = unique {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.187687] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.tb_cache_size = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.187870] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.tx_queue_size = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.188052] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.uid_maps = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.188221] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.use_virtio_for_bridges = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.188396] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.virt_type = kvm {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.188568] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.volume_clear = zero {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.188752] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.volume_clear_size = 0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.188934] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.volume_use_multipath = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.189108] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.vzstorage_cache_path = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.189285] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.189456] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.vzstorage_mount_group = qemu {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.189621] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.vzstorage_mount_opts = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.189795] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.190108] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.190297] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.vzstorage_mount_user = stack {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.190468] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
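Every dotted name in this dump maps to an option in the correspondingly named section of nova.conf; the `libvirt.*` block above is the oslo.config view of a `[libvirt]` section merged with registered defaults. A small sketch of that mapping (the file contents and option subset here are illustrative, not the full Nova schema):

```python
import tempfile

from oslo_config import cfg

CONF = cfg.ConfigOpts()
CONF.register_opts(
    [cfg.StrOpt('virt_type', default='kvm'),
     cfg.StrOpt('rng_dev_path', default='/dev/urandom')],
    group='libvirt')

# A throwaway config file standing in for /etc/nova/nova.conf.
with tempfile.NamedTemporaryFile('w', suffix='.conf') as conf_file:
    conf_file.write('[libvirt]\nvirt_type = kvm\n')
    conf_file.flush()
    CONF([], default_config_files=[conf_file.name])

print(CONF.libvirt.virt_type)     # 'kvm', from the file
print(CONF.libvirt.rng_dev_path)  # '/dev/urandom', from the registered default
```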
[ 580.190643] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.auth_section = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.190821] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.auth_type = password {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.190989] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.cafile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.191168] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.certfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.191334] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.collect_timing = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.191493] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.connect_retries = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.191654] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.connect_retry_delay = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.191828] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.default_floating_pool = public {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.191988] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.endpoint_override = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.192182] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.extension_sync_interval = 600 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.192331] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.http_retries = 3 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.192491] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.insecure = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.192650] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.keyfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.192808] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.max_version = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.192999] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.193189] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.min_version = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.193367] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.ovs_bridge = br-int {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.193535] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.physnets = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.193709] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.region_name = RegionOne {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.193882] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.service_metadata_proxy = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.194055] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.service_name = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.194234] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.service_type = network {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.194403] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.split_loggers = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.194563] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.status_code_retries = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.194726] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.status_code_retry_delay = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.194890] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.timeout = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.195079] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.195244] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] neutron.version = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
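Note that `neutron.metadata_proxy_shared_secret` above is rendered as `****`: options registered with `secret=True` are masked by `log_opt_values()` so credentials never reach the log (the same masking appears for `placement.password` below). A sketch of that behavior, with an invented group and option name:

```python
import logging

from oslo_config import cfg

CONF = cfg.ConfigOpts()
# secret=True marks the value as sensitive for all oslo.config logging paths.
CONF.register_opts(
    [cfg.StrOpt('shared_secret', secret=True, default='s3kr1t')],
    group='demo')

logging.basicConfig(level=logging.DEBUG)
CONF([])
# Logs "demo.shared_secret = ****" -- the real value is never written out.
CONF.log_opt_values(logging.getLogger(__name__), logging.DEBUG)
```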
[ 580.195416] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] notifications.bdms_in_notifications = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.195596] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] notifications.default_level = INFO {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.195821] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] notifications.notification_format = unversioned {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.196028] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] notifications.notify_on_state_change = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.196218] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.196398] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] pci.alias = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.196575] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] pci.device_spec = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.196740] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] pci.report_in_placement = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.196915] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.auth_section = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.197102] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.auth_type = password {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.197277] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.197442] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.cafile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.197605] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.certfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.197801] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.collect_timing = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.197977] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.connect_retries = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.198154] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.connect_retry_delay = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.198315] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.default_domain_id = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.198473] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.default_domain_name = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.198631] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.domain_id = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.198814] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.domain_name = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.198990] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.endpoint_override = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.199170] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.insecure = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.199331] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.keyfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.199489] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.max_version = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.199646] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.min_version = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.199819] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.password = **** {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.200037] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.project_domain_id = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.200237] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.project_domain_name = Default {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.200415] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.project_id = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.200589] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.project_name = service {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.200761] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.region_name = RegionOne {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.200930] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.service_name = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.201114] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.service_type = placement {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.201282] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.split_loggers = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.201442] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.status_code_retries = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.201602] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.status_code_retry_delay = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.201774] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.system_scope = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.201958] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.timeout = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.202142] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.trust_id = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.202308] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.user_domain_id = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.202479] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.user_domain_name = Default {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.202641] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.user_id = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.202816] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.username = placement {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.202999] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.203178] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] placement.version = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.203358] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] quota.cores = 20 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.203523] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] quota.count_usage_from_placement = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.203696] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.203866] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] quota.injected_file_content_bytes = 10240 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.204043] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] quota.injected_file_path_length = 255 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.204218] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] quota.injected_files = 5 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.204388] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] quota.instances = 10 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.204562] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] quota.key_pairs = 100 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.204768] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] quota.metadata_items = 128 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.204980] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] quota.ram = 51200 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.205168] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] quota.recheck_quota = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.205342] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] quota.server_group_members = 10 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.205510] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] quota.server_groups = 10 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.205704] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] rdp.enabled = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.206036] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.206230] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.206403] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.206570] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] scheduler.image_metadata_prefilter = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.206737] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.206906] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] scheduler.max_attempts = 3 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.207083] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] scheduler.max_placement_results = 1000 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.207253] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.207417] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] scheduler.query_placement_for_image_type_support = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.207579] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.207775] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] scheduler.workers = 2 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.207963] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.208152] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.208336] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.208506] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.208673] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.208838] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.209014] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.209212] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.209381] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] filter_scheduler.host_subset_size = 1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.209548] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.209711] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.209875] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.210047] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] filter_scheduler.isolated_hosts = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.210215] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] filter_scheduler.isolated_images = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.210376] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.210536] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.210699] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] filter_scheduler.num_instances_weight_multiplier = 0.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.210863] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] filter_scheduler.pci_in_placement = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.211032] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.211198] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.211362] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.211525] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.211689] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.211855] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.212024] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] filter_scheduler.track_instance_changes = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 580.212209] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.212544] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] metrics.weight_multiplier = 1.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.212709] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.212878] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] metrics.weight_setting = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.213183] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.213363] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] serial_console.enabled = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.213542] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] serial_console.port_range = 10000:20000 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.213714] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.213885] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.214066] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] serial_console.serialproxy_port = 6083 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.214242] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] service_user.auth_section = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.214416] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] service_user.auth_type = password {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.214579] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] service_user.cafile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.214741] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] service_user.certfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.214909] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] service_user.collect_timing = False {{(pid=67893) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.215108] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] service_user.insecure = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.215276] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] service_user.keyfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.215463] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] service_user.send_service_user_token = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.215649] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] service_user.split_loggers = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.215823] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] service_user.timeout = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.215998] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] spice.agent_enabled = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.216178] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] spice.enabled = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.216471] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.216664] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.216836] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] spice.html5proxy_port = 6082 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.217007] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] spice.image_compression = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.217177] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] spice.jpeg_compression = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.217338] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] spice.playback_compression = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.217512] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] spice.server_listen = 127.0.0.1 {{(pid=67893) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.217692] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.217879] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] spice.streaming_mode = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.218057] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] spice.zlib_compression = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.218232] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] upgrade_levels.baseapi = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.218394] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] upgrade_levels.cert = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.218567] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] upgrade_levels.compute = auto {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.218746] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] upgrade_levels.conductor = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.218921] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] upgrade_levels.scheduler = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.219104] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vendordata_dynamic_auth.auth_section = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.219273] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vendordata_dynamic_auth.auth_type = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.219436] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vendordata_dynamic_auth.cafile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.219595] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vendordata_dynamic_auth.certfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.219758] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.219923] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vendordata_dynamic_auth.insecure = False {{(pid=67893) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.220095] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vendordata_dynamic_auth.keyfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.220260] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.220418] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vendordata_dynamic_auth.timeout = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.220593] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vmware.api_retry_count = 10 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.220754] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vmware.ca_file = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.220929] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vmware.cache_prefix = devstack-image-cache {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.221109] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vmware.cluster_name = testcl1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.221279] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vmware.connection_pool_size = 10 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.221440] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vmware.console_delay_seconds = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.221611] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vmware.datastore_regex = ^datastore.* {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.221822] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.221997] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vmware.host_password = **** {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.222179] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vmware.host_port = 443 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.222352] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vmware.host_username = administrator@vsphere.local {{(pid=67893) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.222521] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vmware.insecure = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.222684] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vmware.integration_bridge = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.222853] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vmware.maximum_objects = 100 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.223024] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vmware.pbm_default_policy = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.223191] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vmware.pbm_enabled = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.223350] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vmware.pbm_wsdl_location = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.223522] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.223685] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vmware.serial_port_proxy_uri = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.223851] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vmware.serial_port_service_uri = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.224023] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vmware.task_poll_interval = 0.5 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.224203] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vmware.use_linked_clone = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.224376] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vmware.vnc_keymap = en-us {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.224544] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vmware.vnc_port = 5900 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.224709] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vmware.vnc_port_total = 10000 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.224901] 
env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vnc.auth_schemes = ['none'] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.225169] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vnc.enabled = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.225499] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.225716] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.225903] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vnc.novncproxy_port = 6080 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.226100] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vnc.server_listen = 127.0.0.1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.226281] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.226449] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vnc.vencrypt_ca_certs = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.226611] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vnc.vencrypt_client_cert = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.226774] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vnc.vencrypt_client_key = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.226950] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.227128] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] workarounds.disable_deep_image_inspection = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.227292] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.227454] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
580.227615] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.227806] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] workarounds.disable_rootwrap = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.227977] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] workarounds.enable_numa_live_migration = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.228156] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.228318] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.228479] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.228638] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] workarounds.libvirt_disable_apic = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.228820] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.228993] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.229171] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.229331] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.229492] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.229653] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.229811] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None 
None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.229973] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.230143] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.230308] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.230492] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.230662] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] wsgi.client_socket_timeout = 900 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.230829] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] wsgi.default_pool_size = 1000 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.230995] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] wsgi.keep_alive = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.231176] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] wsgi.max_header_line = 16384 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.231340] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] wsgi.secure_proxy_ssl_header = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.231500] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] wsgi.ssl_ca_file = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.231661] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] wsgi.ssl_cert_file = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.231822] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] wsgi.ssl_key_file = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.231993] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] wsgi.tcp_keepidle = 600 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.232181] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] 
wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.232351] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] zvm.ca_file = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.232512] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] zvm.cloud_connector_url = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.232809] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.232988] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] zvm.reachable_timeout = 300 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.233184] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_policy.enforce_new_defaults = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.233357] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_policy.enforce_scope = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.233536] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_policy.policy_default_rule = default {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.233721] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.233900] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_policy.policy_file = policy.yaml {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.234090] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.234257] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.234419] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.234577] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=67893) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.234739] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.234910] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.235099] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.235283] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] profiler.connection_string = messaging:// {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.235452] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] profiler.enabled = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.235648] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] profiler.es_doc_type = notification {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.235832] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] profiler.es_scroll_size = 10000 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.236018] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] profiler.es_scroll_time = 2m {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.236193] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] profiler.filter_error_trace = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.236364] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] profiler.hmac_keys = **** {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.236531] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] profiler.sentinel_service_name = mymaster {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.236698] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] profiler.socket_timeout = 0.1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.236860] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] profiler.trace_requests = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.237029] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] profiler.trace_sqlalchemy = False {{(pid=67893) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.237210] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] profiler_jaeger.process_tags = {} {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.237370] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] profiler_jaeger.service_name_prefix = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.237534] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] profiler_otlp.service_name_prefix = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.237715] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] remote_debug.host = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.237894] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] remote_debug.port = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.238087] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.238257] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.238425] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.238589] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.238769] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.238942] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.239120] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.239285] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.239445] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] 
oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.239604] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.239778] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.239949] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.240138] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.240305] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.240468] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.240642] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.240806] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.240969] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.241145] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.241312] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.241470] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.241631] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 
{{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.241789] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.241949] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.242128] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.242294] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.ssl = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.242468] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.242640] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.242804] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.242979] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.243167] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_rabbit.ssl_version = {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.243359] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.243527] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_notifications.retry = -1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.243711] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.243888] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_messaging_notifications.transport_url = **** {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.244070] env[67893]: DEBUG oslo_service.service 
[None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_limit.auth_section = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.244238] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_limit.auth_type = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.244398] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_limit.cafile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.244568] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_limit.certfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.244735] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_limit.collect_timing = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.244896] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_limit.connect_retries = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.245064] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_limit.connect_retry_delay = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.245228] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_limit.endpoint_id = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.245389] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_limit.endpoint_override = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.245547] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_limit.insecure = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.245730] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_limit.keyfile = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.245897] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_limit.max_version = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.246066] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_limit.min_version = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.246229] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_limit.region_name = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.246386] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_limit.service_name = None {{(pid=67893) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.246543] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_limit.service_type = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.246702] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_limit.split_loggers = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.246860] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_limit.status_code_retries = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.247028] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_limit.status_code_retry_delay = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.247191] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_limit.timeout = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.247348] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_limit.valid_interfaces = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.247505] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_limit.version = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.247673] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_reports.file_event_handler = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.247861] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.248036] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] oslo_reports.log_dir = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.248213] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.248372] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vif_plug_linux_bridge_privileged.group = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.248530] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.248707] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vif_plug_linux_bridge_privileged.logger_name = 
oslo_privsep.daemon {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.248888] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.249061] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vif_plug_linux_bridge_privileged.user = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.249236] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.249399] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vif_plug_ovs_privileged.group = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.249558] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.249722] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.249885] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.250051] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] vif_plug_ovs_privileged.user = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.250225] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] os_vif_linux_bridge.flat_interface = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.250406] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.250580] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.250751] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.250924] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.251105] 
env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.251273] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.251436] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] os_vif_linux_bridge.vlan_interface = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.251612] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] os_vif_ovs.default_qos_type = linux-noop {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.251785] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] os_vif_ovs.isolate_vif = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.251955] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.252134] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.252305] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.252474] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] os_vif_ovs.ovsdb_interface = native {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.252635] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] os_vif_ovs.per_port_bridge = False {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.252802] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] os_brick.lock_path = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.252967] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] os_brick.wait_mpath_device_attempts = 4 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.253140] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] os_brick.wait_mpath_device_interval = 1 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.253310] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] privsep_osbrick.capabilities = [21] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.253469] 
env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] privsep_osbrick.group = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.253627] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] privsep_osbrick.helper_command = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.253790] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.253953] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] privsep_osbrick.thread_pool_size = 8 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.254124] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] privsep_osbrick.user = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.254329] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.254495] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] nova_sys_admin.group = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.254655] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] nova_sys_admin.helper_command = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.254821] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.254985] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] nova_sys_admin.thread_pool_size = 8 {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.255158] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] nova_sys_admin.user = None {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 580.255289] env[67893]: DEBUG oslo_service.service [None req-65f92407-72ed-4862-956e-dd941a8dbcd8 None None] ******************************************************************************** {{(pid=67893) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2613}} [ 580.255732] env[67893]: INFO nova.service [-] Starting compute node (version 0.0.1) [ 580.266190] env[67893]: WARNING nova.virt.vmwareapi.driver [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] The vmwareapi driver is not tested by the OpenStack project nor does it have clear maintainer(s) and thus its quality can not be ensured. It should be considered experimental and may be removed in a future release. 
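If you are using the driver in production please let us know via the openstack-discuss mailing list.

The [oslo_limit] through [nova_sys_admin] block a few entries above is oslo.config's startup dump: oslo.service walks every registered option and logs it at DEBUG as "group.option = value", closing the dump with the row of asterisks. A minimal sketch of the same mechanism, assuming a throwaway ConfigOpts instance; the group and option names mirror the log, everything else is illustrative:

import logging
from oslo_config import cfg

logging.basicConfig(level=logging.DEBUG)
LOG = logging.getLogger(__name__)

CONF = cfg.ConfigOpts()
limit_group = cfg.OptGroup(name='oslo_limit')
CONF.register_group(limit_group)
CONF.register_opts(
    [cfg.StrOpt('service_type'),                   # -> oslo_limit.service_type = None
     cfg.BoolOpt('split_loggers', default=False),  # -> oslo_limit.split_loggers = False
     cfg.IntOpt('timeout')],                       # unset options print as None
    group=limit_group)

CONF([])                                  # parse an empty command line
CONF.log_opt_values(LOG, logging.DEBUG)   # one DEBUG line per option, plus banners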
[ 580.266635] env[67893]: INFO nova.virt.node [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] Generated node identity 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 [ 580.266857] env[67893]: INFO nova.virt.node [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] Wrote node identity 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 to /opt/stack/data/n-cpu-1/compute_id [ 580.280183] env[67893]: WARNING nova.compute.manager [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] Compute nodes ['17b8bcc7-ce4b-4d4d-b863-33b2251dfd57'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 580.313016] env[67893]: INFO nova.compute.manager [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 580.332861] env[67893]: WARNING nova.compute.manager [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. [ 580.333098] env[67893]: DEBUG oslo_concurrency.lockutils [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 580.333308] env[67893]: DEBUG oslo_concurrency.lockutils [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 580.333450] env[67893]: DEBUG oslo_concurrency.lockutils [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 580.333599] env[67893]: DEBUG nova.compute.resource_tracker [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 580.334733] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44857357-7901-4dd6-bc13-c931c8479992 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.343220] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c747e7d7-f9c0-487f-9bbc-5a6b2e444055 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.357873] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99a077c6-3bb3-4ba8-a8d3-322ea47d5393 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.363975] env[67893]: DEBUG oslo_vmware.service [-]
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac71da69-8323-4225-88a3-1e5a3825ccf9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.393095] env[67893]: DEBUG nova.compute.resource_tracker [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180954MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 580.393245] env[67893]: DEBUG oslo_concurrency.lockutils [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 580.393422] env[67893]: DEBUG oslo_concurrency.lockutils [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 580.406188] env[67893]: WARNING nova.compute.resource_tracker [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] No compute node record for cpu-1:17b8bcc7-ce4b-4d4d-b863-33b2251dfd57: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 could not be found. [ 580.419169] env[67893]: INFO nova.compute.resource_tracker [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 [ 580.468892] env[67893]: DEBUG nova.compute.resource_tracker [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 580.469139] env[67893]: DEBUG nova.compute.resource_tracker [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 580.573037] env[67893]: INFO nova.scheduler.client.report [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] [req-9a107de8-56ae-41b5-a7ac-732873a46eb2] Created resource provider record via placement API for resource provider with UUID 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. 
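At this point the resource tracker has audited the host (free_ram=180954MB, free_disk=95GB, free_vcpus=48) and created a resource provider in Placement; the inventory it publishes appears verbatim in the entries that follow. A worked sketch (not Nova's code) of what those numbers mean for scheduling:

# Inventory reported for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57,
# copied from the log entries below.
inventory = {
    'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
             'step_size': 1, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1,
                  'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95,
                'step_size': 1, 'allocation_ratio': 1.0},
}

def capacity(rc):
    """Placement's usable capacity: (total - reserved) * allocation_ratio."""
    inv = inventory[rc]
    return int((inv['total'] - inv['reserved']) * inv['allocation_ratio'])

print(capacity('VCPU'))       # 192: 48 physical cores oversubscribed 4x
print(capacity('MEMORY_MB'))  # 196078: memory is not oversubscribed
# max_unit caps any single allocation: no one instance can claim more
# than 16 VCPUs or 95 GB of disk from this provider.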
[ 580.589779] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b9f7925-ec7a-4a1b-aae1-7e8294f69241 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.597097] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00056e6c-7b0c-4668-b6a8-df214456a2d6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.626245] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6da79a9d-3bd6-4d8a-a9e5-c99f699e3498 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.633143] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf96d74c-9998-4970-bba9-fef1a402219b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.645613] env[67893]: DEBUG nova.compute.provider_tree [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] Updating inventory in ProviderTree for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 580.684327] env[67893]: DEBUG nova.scheduler.client.report [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] Updated inventory for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 580.684564] env[67893]: DEBUG nova.compute.provider_tree [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] Updating resource provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 generation from 0 to 1 during operation: update_inventory {{(pid=67893) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 580.684707] env[67893]: DEBUG nova.compute.provider_tree [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] Updating inventory in ProviderTree for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 580.732508] env[67893]: DEBUG nova.compute.provider_tree [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] Updating resource 
provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 generation from 1 to 2 during operation: update_traits {{(pid=67893) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 580.749850] env[67893]: DEBUG nova.compute.resource_tracker [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 580.750245] env[67893]: DEBUG oslo_concurrency.lockutils [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.357s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 580.750245] env[67893]: DEBUG nova.service [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] Creating RPC server for service compute {{(pid=67893) start /opt/stack/nova/nova/service.py:182}} [ 580.762259] env[67893]: DEBUG nova.service [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] Join ServiceGroup membership for this service compute {{(pid=67893) start /opt/stack/nova/nova/service.py:199}} [ 580.762442] env[67893]: DEBUG nova.servicegroup.drivers.db [None req-65baacb7-fd4e-4b5a-802c-d293d7a07291 None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=67893) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 590.092408] env[67893]: DEBUG dbcounter [-] [67893] Writing DB stats nova_cell0:SELECT=1 {{(pid=67893) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 590.093106] env[67893]: DEBUG dbcounter [-] [67893] Writing DB stats nova_cell1:SELECT=1 {{(pid=67893) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 590.764256] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._sync_power_states {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 590.774081] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Getting list of instances from cluster (obj){ [ 590.774081] env[67893]: value = "domain-c8" [ 590.774081] env[67893]: _type = "ClusterComputeResource" [ 590.774081] env[67893]: } {{(pid=67893) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 590.775197] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-910e2157-ce62-4b00-ad73-a95ae4094d10 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.784241] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Got total of 0 instances {{(pid=67893) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 590.784467] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 590.784770] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Getting list of instances from cluster (obj){ [ 590.784770] 
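env[67893]: value = "domain-c8" [ 590.784770] env[67893]: _type = "ClusterComputeResource" [ 590.784770] env[67893]: } {{(pid=67893) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}}

The "Running periodic task ComputeManager._sync_power_states" entries above come from oslo.service: methods decorated with @periodic_task.periodic_task are collected on the manager class and invoked on a timer by run_periodic_tasks(). A minimal sketch with an illustrative Manager class; only the decorator and the runner are the real API:

import logging
from oslo_config import cfg
from oslo_service import periodic_task

logging.basicConfig(level=logging.DEBUG)
CONF = cfg.CONF
CONF([])  # parse an empty command line so option defaults are readable

class Manager(periodic_task.PeriodicTasks):
    def __init__(self):
        super().__init__(CONF)

    @periodic_task.periodic_task(spacing=600, run_immediately=True)
    def _sync_power_states(self, context):
        # Reconcile DB power state with the hypervisor, as the
        # ComputeManager task of the same name does.
        pass

mgr = Manager()
mgr.run_periodic_tasks(context=None)  # logs "Running periodic task ..." at DEBUG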
env[67893]: value = "domain-c8" [ 590.784770] env[67893]: _type = "ClusterComputeResource" [ 590.784770] env[67893]: } {{(pid=67893) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 590.785598] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7299a6da-948f-421b-b6ca-4bba3e19fb3f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.792842] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Got total of 0 instances {{(pid=67893) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 617.358347] env[67893]: DEBUG oslo_concurrency.lockutils [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Acquiring lock "0d074bfa-7d3d-4e69-b544-36e7d9f79483" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 617.358779] env[67893]: DEBUG oslo_concurrency.lockutils [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Lock "0d074bfa-7d3d-4e69-b544-36e7d9f79483" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 617.379939] env[67893]: DEBUG nova.compute.manager [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Starting instance... 
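The Acquiring/acquired pair above shows how builds are serialized per instance: build_and_run_instance takes an oslo.concurrency lock named after the instance UUID before running the inner _locked_do_build_and_run_instance. A minimal sketch of the same pattern (Nova reaches lockutils through its own utils.synchronized wrapper; this standalone version is illustrative):

from oslo_concurrency import lockutils

def build_and_run_instance(instance_uuid):
    @lockutils.synchronized(instance_uuid)
    def _locked_do_build_and_run_instance():
        # Concurrent builds/deletes of the same instance block here,
        # which is what the ":: waited N.NNNs" suffix measures.
        print('building %s under its own lock' % instance_uuid)
    _locked_do_build_and_run_instance()

build_and_run_instance('0d074bfa-7d3d-4e69-b544-36e7d9f79483')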
[ 617.500125] env[67893]: DEBUG oslo_concurrency.lockutils [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 617.500125] env[67893]: DEBUG oslo_concurrency.lockutils [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 617.500125] env[67893]: INFO nova.compute.claims [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 617.655948] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc8e1a59-1707-4e94-9048-fd7256142ef2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 617.664383] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65e9208b-1cd0-4b01-ae49-c27e0dd08716 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 617.701041] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7271a7c-1d01-4bef-8dad-a3ed3757223a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 617.709182] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59251257-cea7-45c6-b9fa-7cfb98087c4e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 617.724681] env[67893]: DEBUG nova.compute.provider_tree [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 617.741510] env[67893]: DEBUG nova.scheduler.client.report [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 617.765216] env[67893]: DEBUG oslo_concurrency.lockutils [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633
tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.267s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 617.765932] env[67893]: DEBUG nova.compute.manager [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 617.812577] env[67893]: DEBUG nova.compute.utils [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 617.814492] env[67893]: DEBUG nova.compute.manager [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 617.814869] env[67893]: DEBUG nova.network.neutron [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 617.831018] env[67893]: DEBUG nova.compute.manager [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 617.912029] env[67893]: DEBUG nova.compute.manager [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 619.777505] env[67893]: DEBUG nova.virt.hardware [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 619.778096] env[67893]: DEBUG nova.virt.hardware [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 619.778427] env[67893]: DEBUG nova.virt.hardware [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 619.778513] env[67893]: DEBUG nova.virt.hardware [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 619.778659] env[67893]: DEBUG nova.virt.hardware [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 619.778812] env[67893]: DEBUG nova.virt.hardware [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 619.779125] env[67893]: DEBUG nova.virt.hardware [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 619.779243] env[67893]: DEBUG nova.virt.hardware [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 619.779583] env[67893]: DEBUG nova.virt.hardware [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 
tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 619.780151] env[67893]: DEBUG nova.virt.hardware [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 619.780733] env[67893]: DEBUG nova.virt.hardware [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 619.781483] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c11fdca8-874e-4e99-a666-af466f472ab3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 619.792677] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29c813e7-2cab-48dd-9d07-b4c53d0c1a52 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 619.814059] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef5fdc06-e6a8-4c70-850a-bd0353b57ef6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 620.352612] env[67893]: DEBUG nova.policy [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6f96a115e8bc4a228b053c119dca37a1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c5df28be12c7480e9a5d729aa987cd43', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 622.780603] env[67893]: DEBUG nova.network.neutron [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Successfully created port: 45705223-1cd2-4770-b083-9cd313b2aa10 {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 626.170521] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Acquiring lock "d9e47a83-7921-4cf6-ba99-fb705bc52e4a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 626.170793] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Lock 
"d9e47a83-7921-4cf6-ba99-fb705bc52e4a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 626.191958] env[67893]: DEBUG nova.compute.manager [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 626.261861] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 626.262139] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 626.263931] env[67893]: INFO nova.compute.claims [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 626.391996] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f8f2523-b74c-4e49-81bc-4063b0244021 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 626.400738] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c6b6357-4331-405c-8b48-1018d01ae64b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 626.433846] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89c63ea4-f3ca-404e-a2fb-31eb00d912fc {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 626.445401] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1173c84-5799-476c-bdaf-52c7b59f87ae {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 626.460493] env[67893]: DEBUG nova.compute.provider_tree [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 626.470928] env[67893]: DEBUG nova.scheduler.client.report [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Inventory has not changed for provider 
17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 626.490953] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.229s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 626.491643] env[67893]: DEBUG nova.compute.manager [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 626.543283] env[67893]: DEBUG nova.compute.utils [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 626.545133] env[67893]: DEBUG nova.compute.manager [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 626.547131] env[67893]: DEBUG nova.network.neutron [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 626.562708] env[67893]: DEBUG nova.compute.manager [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 626.665599] env[67893]: DEBUG nova.compute.manager [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 626.698924] env[67893]: DEBUG nova.virt.hardware [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 626.699061] env[67893]: DEBUG nova.virt.hardware [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 626.699473] env[67893]: DEBUG nova.virt.hardware [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 626.699473] env[67893]: DEBUG nova.virt.hardware [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 626.700067] env[67893]: DEBUG nova.virt.hardware [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 626.700067] env[67893]: DEBUG nova.virt.hardware [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 626.700067] env[67893]: DEBUG nova.virt.hardware [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 626.700067] env[67893]: DEBUG nova.virt.hardware [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 626.700222] env[67893]: DEBUG nova.virt.hardware [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 626.700320] env[67893]: DEBUG nova.virt.hardware [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 626.700474] env[67893]: DEBUG nova.virt.hardware [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 626.701444] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a2e2999-d35b-4629-ab8c-b76c2123b3aa {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 626.709942] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1dc40413-bdba-44ab-9621-b40c086e3a75 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 626.997612] env[67893]: DEBUG nova.network.neutron [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Successfully updated port: 45705223-1cd2-4770-b083-9cd313b2aa10 {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 627.023050] env[67893]: DEBUG oslo_concurrency.lockutils [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Acquiring lock "refresh_cache-0d074bfa-7d3d-4e69-b544-36e7d9f79483" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 627.025019] env[67893]: DEBUG oslo_concurrency.lockutils [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Acquired lock "refresh_cache-0d074bfa-7d3d-4e69-b544-36e7d9f79483" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 627.025019] env[67893]: DEBUG nova.network.neutron [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 627.047672] env[67893]: DEBUG nova.policy [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fd8c46057f074e15ba87d6e130f70a7a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 
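'60b875938b85447ea8a3a60b7be31f2c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}}

A "Policy check ... failed" entry like the one above is expected under tempest: the request carries only member/reader roles, so the admin-oriented network:attach_external_network rule denies it and port allocation proceeds without external networks. A minimal sketch, assuming oslo.policy; the rule string role:admin is an illustrative stand-in for Nova's actual default:

from oslo_config import cfg
from oslo_policy import policy

CONF = cfg.CONF
CONF([])  # parse an empty command line

enforcer = policy.Enforcer(CONF)
enforcer.register_default(
    policy.RuleDefault('network:attach_external_network',
                       'role:admin'))  # assumed admin-only rule

creds = {'roles': ['member', 'reader'],
         'project_id': 'c5df28be12c7480e9a5d729aa987cd43'}
print(enforcer.enforce('network:attach_external_network', {}, creds))
# -> False: the check fails, matching the log entry above.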
[ 627.204918] env[67893]: DEBUG nova.network.neutron [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 627.317252] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Acquiring lock "30d52736-4195-4767-89e0-8572dc96de29" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 627.317438] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Lock "30d52736-4195-4767-89e0-8572dc96de29" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 627.335327] env[67893]: DEBUG nova.compute.manager [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Starting instance...
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 627.416373] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 627.416618] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 627.418647] env[67893]: INFO nova.compute.claims [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 627.587972] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1376b35b-05e5-4f4c-8674-f7e3f9ab2ce9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 627.600755] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88104108-965e-4f7e-a351-4588c4c9c73d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 627.647642] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d419eb3-e43c-4ebe-872c-9297c8054daa {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 627.657823] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f9f10df-f570-44b1-9fb1-43f3a9a16e25 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 627.677402] env[67893]: DEBUG nova.compute.provider_tree [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 627.700462] env[67893]: DEBUG nova.scheduler.client.report [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 
627.722544] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.306s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 627.723118] env[67893]: DEBUG nova.compute.manager [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 627.766094] env[67893]: DEBUG nova.compute.utils [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 627.767646] env[67893]: DEBUG nova.compute.manager [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 627.767784] env[67893]: DEBUG nova.network.neutron [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 627.779678] env[67893]: DEBUG nova.compute.manager [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 627.856928] env[67893]: DEBUG nova.compute.manager [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 627.895106] env[67893]: DEBUG nova.virt.hardware [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 627.895353] env[67893]: DEBUG nova.virt.hardware [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 627.895506] env[67893]: DEBUG nova.virt.hardware [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 627.895679] env[67893]: DEBUG nova.virt.hardware [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 627.898504] env[67893]: DEBUG nova.virt.hardware [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 627.898683] env[67893]: DEBUG nova.virt.hardware [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 627.898940] env[67893]: DEBUG nova.virt.hardware [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 627.899163] env[67893]: DEBUG nova.virt.hardware [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 627.899362] env[67893]: DEBUG nova.virt.hardware [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 627.899526] env[67893]: DEBUG nova.virt.hardware [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 627.899693] env[67893]: DEBUG nova.virt.hardware [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 627.900948] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61807a16-a7b5-4342-b874-d15c8e52134d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 627.909974] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bf8be41-73ea-45df-ad20-e0fcb5d03915 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 628.155786] env[67893]: DEBUG nova.policy [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'aca147a8718d43ec821b0a992bbf756d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '71be834fce6a435ca74bd0b99b2ee3df', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 628.193032] env[67893]: DEBUG nova.compute.manager [req-aa87d326-d98f-4fbe-af8a-42956dbab389 req-3d1a5d80-b2fb-477a-8a3d-665b2995c387 service nova] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Received event network-vif-plugged-45705223-1cd2-4770-b083-9cd313b2aa10 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 628.193032] env[67893]: DEBUG oslo_concurrency.lockutils [req-aa87d326-d98f-4fbe-af8a-42956dbab389 req-3d1a5d80-b2fb-477a-8a3d-665b2995c387 service nova] Acquiring lock "0d074bfa-7d3d-4e69-b544-36e7d9f79483-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 628.193032] env[67893]: DEBUG oslo_concurrency.lockutils [req-aa87d326-d98f-4fbe-af8a-42956dbab389 req-3d1a5d80-b2fb-477a-8a3d-665b2995c387 service nova] Lock "0d074bfa-7d3d-4e69-b544-36e7d9f79483-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67893) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 628.193032] env[67893]: DEBUG oslo_concurrency.lockutils [req-aa87d326-d98f-4fbe-af8a-42956dbab389 req-3d1a5d80-b2fb-477a-8a3d-665b2995c387 service nova] Lock "0d074bfa-7d3d-4e69-b544-36e7d9f79483-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 628.193377] env[67893]: DEBUG nova.compute.manager [req-aa87d326-d98f-4fbe-af8a-42956dbab389 req-3d1a5d80-b2fb-477a-8a3d-665b2995c387 service nova] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] No waiting events found dispatching network-vif-plugged-45705223-1cd2-4770-b083-9cd313b2aa10 {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 628.193377] env[67893]: WARNING nova.compute.manager [req-aa87d326-d98f-4fbe-af8a-42956dbab389 req-3d1a5d80-b2fb-477a-8a3d-665b2995c387 service nova] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Received unexpected event network-vif-plugged-45705223-1cd2-4770-b083-9cd313b2aa10 for instance with vm_state building and task_state spawning. [ 628.671042] env[67893]: DEBUG nova.network.neutron [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Updating instance_info_cache with network_info: [{"id": "45705223-1cd2-4770-b083-9cd313b2aa10", "address": "fa:16:3e:cd:28:cf", "network": {"id": "0158027f-656f-43e3-aebf-7cbb75cfd948", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.207", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "304a5519bb7c46efb34a42749d9cf409", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a8b99a46-3e7f-4ef1-9e45-58e6cd17f210", "external-id": "nsx-vlan-transportzone-704", "segmentation_id": 704, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap45705223-1c", "ovs_interfaceid": "45705223-1cd2-4770-b083-9cd313b2aa10", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 628.691608] env[67893]: DEBUG oslo_concurrency.lockutils [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Releasing lock "refresh_cache-0d074bfa-7d3d-4e69-b544-36e7d9f79483" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 628.691608] env[67893]: DEBUG nova.compute.manager [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Instance network_info: |[{"id": "45705223-1cd2-4770-b083-9cd313b2aa10", "address": "fa:16:3e:cd:28:cf", "network": {"id": "0158027f-656f-43e3-aebf-7cbb75cfd948", "bridge": "br-int", "label": 
"shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.207", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "304a5519bb7c46efb34a42749d9cf409", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a8b99a46-3e7f-4ef1-9e45-58e6cd17f210", "external-id": "nsx-vlan-transportzone-704", "segmentation_id": 704, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap45705223-1c", "ovs_interfaceid": "45705223-1cd2-4770-b083-9cd313b2aa10", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 628.691738] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:cd:28:cf', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a8b99a46-3e7f-4ef1-9e45-58e6cd17f210', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '45705223-1cd2-4770-b083-9cd313b2aa10', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 628.708351] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 628.710149] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2da6024e-0da6-47d6-a08d-25018bbc6f4a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 628.722271] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Created folder: OpenStack in parent group-v4. [ 628.722466] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Creating folder: Project (c5df28be12c7480e9a5d729aa987cd43). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 628.722734] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ffb81b31-ef46-430d-b7de-8e06af6011dd {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 628.733672] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Created folder: Project (c5df28be12c7480e9a5d729aa987cd43) in parent group-v689771. 
[ 628.734089] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Creating folder: Instances. Parent ref: group-v689772. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 628.734089] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d2e6bf86-117b-44fa-89a1-d8044e6f5cc1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 628.743785] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Created folder: Instances in parent group-v689772. [ 628.743785] env[67893]: DEBUG oslo.service.loopingcall [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 628.743865] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 628.744583] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-53bf5cfc-b698-47e9-a702-7843ce6ded10 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 628.771204] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 628.771204] env[67893]: value = "task-3455289" [ 628.771204] env[67893]: _type = "Task" [ 628.771204] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 628.783186] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455289, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 628.996194] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Acquiring lock "3e67c74f-5c03-4dc4-a23b-b547bfb32b4a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 628.996194] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Lock "3e67c74f-5c03-4dc4-a23b-b547bfb32b4a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 629.017679] env[67893]: DEBUG nova.compute.manager [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Starting instance... 
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 629.103274] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 629.103493] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 629.105659] env[67893]: INFO nova.compute.claims [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 629.284326] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455289, 'name': CreateVM_Task, 'duration_secs': 0.491208} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 629.289599] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 629.307683] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d26cc790-05a9-4060-aca5-adc2b463a768 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 629.316225] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-218251d0-cc6c-4437-b93b-23cd23aa997f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 629.353973] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f743d0bf-86c3-4253-b76b-2e02c4448eba {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 629.362100] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c61aac92-ef8a-4ba1-9662-25c8ff73f76b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 629.377928] env[67893]: DEBUG nova.compute.provider_tree [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 629.399537] env[67893]: DEBUG nova.scheduler.client.report [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 
'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 629.418428] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.315s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 629.419149] env[67893]: DEBUG nova.compute.manager [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 629.470840] env[67893]: DEBUG nova.compute.utils [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 629.471987] env[67893]: DEBUG nova.compute.manager [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 629.472178] env[67893]: DEBUG nova.network.neutron [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 629.486355] env[67893]: DEBUG oslo_vmware.service [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aaa8b9fc-8c74-4013-9a6b-d28b9e606024 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 629.491837] env[67893]: DEBUG nova.compute.manager [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Start building block device mappings for instance. 
{{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 629.500857] env[67893]: DEBUG oslo_concurrency.lockutils [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 629.500857] env[67893]: DEBUG oslo_concurrency.lockutils [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 629.500946] env[67893]: DEBUG oslo_concurrency.lockutils [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 629.501479] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-03bc1137-1019-4817-b71e-25dcfed1e520 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 629.510917] env[67893]: DEBUG oslo_vmware.api [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Waiting for the task: (returnval){ [ 629.510917] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]520b6263-265f-77d1-6a8c-41f2b524f30b" [ 629.510917] env[67893]: _type = "Task" [ 629.510917] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 629.528470] env[67893]: DEBUG oslo_vmware.api [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]520b6263-265f-77d1-6a8c-41f2b524f30b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 629.624137] env[67893]: DEBUG nova.compute.manager [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 629.651149] env[67893]: DEBUG nova.virt.hardware [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 629.651403] env[67893]: DEBUG nova.virt.hardware [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 629.651541] env[67893]: DEBUG nova.virt.hardware [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 629.651708] env[67893]: DEBUG nova.virt.hardware [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 629.651847] env[67893]: DEBUG nova.virt.hardware [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 629.651995] env[67893]: DEBUG nova.virt.hardware [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 629.652332] env[67893]: DEBUG nova.virt.hardware [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 629.652490] env[67893]: DEBUG nova.virt.hardware [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 629.652682] env[67893]: DEBUG 
nova.virt.hardware [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 629.652803] env[67893]: DEBUG nova.virt.hardware [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 629.652966] env[67893]: DEBUG nova.virt.hardware [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 629.653937] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f62f8ea1-3b73-41d7-9a0d-0037583c27cd {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 629.663391] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4e42d59-02fd-4830-a8c2-8e0ba6847f03 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 629.973332] env[67893]: DEBUG nova.network.neutron [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Successfully created port: 88a3ec34-f85d-4b1a-8407-d70197c799a7 {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 630.023416] env[67893]: DEBUG oslo_concurrency.lockutils [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 630.023416] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 630.023690] env[67893]: DEBUG oslo_concurrency.lockutils [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 630.023827] env[67893]: DEBUG oslo_concurrency.lockutils [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 630.024365] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 630.024543] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f03afc4b-e0ef-4b21-aa85-ba4dc30cf046 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.045510] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 630.045510] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 630.045510] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc4b38a9-63e3-42a7-890e-4226f79f0724 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.053563] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3749f3cc-d75f-444e-af9c-4589e6a8e37c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.059593] env[67893]: DEBUG oslo_vmware.api [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Waiting for the task: (returnval){ [ 630.059593] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]527700cd-c4fd-9540-f333-b13a41613800" [ 630.059593] env[67893]: _type = "Task" [ 630.059593] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 630.067928] env[67893]: DEBUG oslo_vmware.api [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]527700cd-c4fd-9540-f333-b13a41613800, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 630.098919] env[67893]: DEBUG nova.policy [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '55ec8ebc2e494d7199170003a1766754', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '78817a6a194f4038a2d8c6bdac194466', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 630.575105] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 630.576010] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Creating directory with path [datastore1] vmware_temp/5c9cbb4b-6bc7-425b-96fc-9cdd49e101e7/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 630.576562] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5f83ebac-0461-4cc0-a4d0-c40d75214cc9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.593673] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Created directory with path [datastore1] vmware_temp/5c9cbb4b-6bc7-425b-96fc-9cdd49e101e7/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 630.594012] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Fetch image to [datastore1] vmware_temp/5c9cbb4b-6bc7-425b-96fc-9cdd49e101e7/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 630.594200] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/5c9cbb4b-6bc7-425b-96fc-9cdd49e101e7/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 630.594993] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f19cf31c-4732-41eb-8e1c-1f752d801a56 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
630.607144] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aacc2dcb-fdc2-47eb-bf94-89031b4b6885 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.617319] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8637dda3-6cf5-48d9-8d75-b9f16dbb2862 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.651910] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4155539d-e312-48bd-8ed1-88f24736cc17 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.658783] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1b9f9f02-e74e-4888-82f2-d3cd45a346f1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.696794] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 630.781027] env[67893]: DEBUG oslo_vmware.rw_handles [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5c9cbb4b-6bc7-425b-96fc-9cdd49e101e7/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 630.847011] env[67893]: DEBUG oslo_vmware.rw_handles [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 630.847237] env[67893]: DEBUG oslo_vmware.rw_handles [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5c9cbb4b-6bc7-425b-96fc-9cdd49e101e7/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 630.891511] env[67893]: DEBUG nova.network.neutron [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Successfully created port: 4b6f4074-f5df-4e53-a29d-c06d78ea3ec6 {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 632.990806] env[67893]: DEBUG nova.network.neutron [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Successfully created port: 97b5640a-9b72-4426-a6bc-8b1b5ab5a7f6 {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 634.645617] env[67893]: DEBUG nova.compute.manager [req-41d789e0-9ae4-4ff8-9625-d1ba5d09269e req-95d282be-36c1-4d2e-9919-ae46a7347473 service nova] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Received event network-changed-45705223-1cd2-4770-b083-9cd313b2aa10 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 634.645890] env[67893]: DEBUG nova.compute.manager [req-41d789e0-9ae4-4ff8-9625-d1ba5d09269e req-95d282be-36c1-4d2e-9919-ae46a7347473 service nova] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Refreshing instance network info cache due to event network-changed-45705223-1cd2-4770-b083-9cd313b2aa10. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 634.646052] env[67893]: DEBUG oslo_concurrency.lockutils [req-41d789e0-9ae4-4ff8-9625-d1ba5d09269e req-95d282be-36c1-4d2e-9919-ae46a7347473 service nova] Acquiring lock "refresh_cache-0d074bfa-7d3d-4e69-b544-36e7d9f79483" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 634.646235] env[67893]: DEBUG oslo_concurrency.lockutils [req-41d789e0-9ae4-4ff8-9625-d1ba5d09269e req-95d282be-36c1-4d2e-9919-ae46a7347473 service nova] Acquired lock "refresh_cache-0d074bfa-7d3d-4e69-b544-36e7d9f79483" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 634.646518] env[67893]: DEBUG nova.network.neutron [req-41d789e0-9ae4-4ff8-9625-d1ba5d09269e req-95d282be-36c1-4d2e-9919-ae46a7347473 service nova] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Refreshing network info cache for port 45705223-1cd2-4770-b083-9cd313b2aa10 {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 634.760300] env[67893]: DEBUG nova.network.neutron [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Successfully updated port: 88a3ec34-f85d-4b1a-8407-d70197c799a7 {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 634.777366] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Acquiring lock "refresh_cache-30d52736-4195-4767-89e0-8572dc96de29" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 634.777513] env[67893]: DEBUG oslo_concurrency.lockutils [None 
req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Acquired lock "refresh_cache-30d52736-4195-4767-89e0-8572dc96de29" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 634.777637] env[67893]: DEBUG nova.network.neutron [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 634.960817] env[67893]: DEBUG nova.network.neutron [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 635.740539] env[67893]: DEBUG nova.network.neutron [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Successfully updated port: 4b6f4074-f5df-4e53-a29d-c06d78ea3ec6 {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 635.758906] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Acquiring lock "refresh_cache-d9e47a83-7921-4cf6-ba99-fb705bc52e4a" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 635.758906] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Acquired lock "refresh_cache-d9e47a83-7921-4cf6-ba99-fb705bc52e4a" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 635.758906] env[67893]: DEBUG nova.network.neutron [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 635.920711] env[67893]: DEBUG nova.network.neutron [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Instance cache missing network info. 
{{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 636.075345] env[67893]: DEBUG nova.network.neutron [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Updating instance_info_cache with network_info: [{"id": "88a3ec34-f85d-4b1a-8407-d70197c799a7", "address": "fa:16:3e:4f:b5:e7", "network": {"id": "03c86b83-0a64-4a10-b435-29afa00ff204", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-12926787-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "71be834fce6a435ca74bd0b99b2ee3df", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3ff3baee-99ce-4b51-ae98-efc6163aaab3", "external-id": "nsx-vlan-transportzone-574", "segmentation_id": 574, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap88a3ec34-f8", "ovs_interfaceid": "88a3ec34-f85d-4b1a-8407-d70197c799a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 636.087188] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Releasing lock "refresh_cache-30d52736-4195-4767-89e0-8572dc96de29" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 636.087459] env[67893]: DEBUG nova.compute.manager [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Instance network_info: |[{"id": "88a3ec34-f85d-4b1a-8407-d70197c799a7", "address": "fa:16:3e:4f:b5:e7", "network": {"id": "03c86b83-0a64-4a10-b435-29afa00ff204", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-12926787-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "71be834fce6a435ca74bd0b99b2ee3df", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3ff3baee-99ce-4b51-ae98-efc6163aaab3", "external-id": "nsx-vlan-transportzone-574", "segmentation_id": 574, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap88a3ec34-f8", "ovs_interfaceid": "88a3ec34-f85d-4b1a-8407-d70197c799a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": 
{}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 636.088158] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:4f:b5:e7', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3ff3baee-99ce-4b51-ae98-efc6163aaab3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '88a3ec34-f85d-4b1a-8407-d70197c799a7', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 636.103687] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Creating folder: Project (71be834fce6a435ca74bd0b99b2ee3df). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 636.103797] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0a2b6fb9-6c43-4714-9923-6e5de6df28a2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 636.116917] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Created folder: Project (71be834fce6a435ca74bd0b99b2ee3df) in parent group-v689771. [ 636.117129] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Creating folder: Instances. Parent ref: group-v689775. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 636.118320] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-58d3c722-e795-4976-a287-0f15a81c63f1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 636.130318] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Created folder: Instances in parent group-v689775. [ 636.130419] env[67893]: DEBUG oslo.service.loopingcall [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 636.130547] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 636.131065] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e5a7769d-f1fd-439f-845e-b519c6c1aec4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 636.153898] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 636.153898] env[67893]: value = "task-3455292" [ 636.153898] env[67893]: _type = "Task" [ 636.153898] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 636.165571] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455292, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 636.667482] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455292, 'name': CreateVM_Task, 'duration_secs': 0.3361} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 636.667781] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 636.668316] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 636.668478] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 636.668791] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 636.669065] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3e205e55-dca8-4dbd-9d73-3315b529121a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 636.673574] env[67893]: DEBUG oslo_vmware.api [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Waiting for the task: (returnval){ [ 636.673574] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52fe94f5-73f2-e9e6-e0b9-f9a38e03d48d" [ 636.673574] env[67893]: _type = "Task" [ 636.673574] 
env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 636.682369] env[67893]: DEBUG oslo_vmware.api [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52fe94f5-73f2-e9e6-e0b9-f9a38e03d48d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 636.860099] env[67893]: DEBUG nova.network.neutron [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Updating instance_info_cache with network_info: [{"id": "4b6f4074-f5df-4e53-a29d-c06d78ea3ec6", "address": "fa:16:3e:40:fc:04", "network": {"id": "0158027f-656f-43e3-aebf-7cbb75cfd948", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.117", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "304a5519bb7c46efb34a42749d9cf409", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a8b99a46-3e7f-4ef1-9e45-58e6cd17f210", "external-id": "nsx-vlan-transportzone-704", "segmentation_id": 704, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4b6f4074-f5", "ovs_interfaceid": "4b6f4074-f5df-4e53-a29d-c06d78ea3ec6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 636.866815] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 636.867145] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 636.867407] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 636.867553] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 636.876971] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Releasing lock "refresh_cache-d9e47a83-7921-4cf6-ba99-fb705bc52e4a" {{(pid=67893) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 636.877313] env[67893]: DEBUG nova.compute.manager [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Instance network_info: |[{"id": "4b6f4074-f5df-4e53-a29d-c06d78ea3ec6", "address": "fa:16:3e:40:fc:04", "network": {"id": "0158027f-656f-43e3-aebf-7cbb75cfd948", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.117", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "304a5519bb7c46efb34a42749d9cf409", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a8b99a46-3e7f-4ef1-9e45-58e6cd17f210", "external-id": "nsx-vlan-transportzone-704", "segmentation_id": 704, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4b6f4074-f5", "ovs_interfaceid": "4b6f4074-f5df-4e53-a29d-c06d78ea3ec6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 636.877692] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:40:fc:04', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a8b99a46-3e7f-4ef1-9e45-58e6cd17f210', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4b6f4074-f5df-4e53-a29d-c06d78ea3ec6', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 636.890279] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Creating folder: Project (60b875938b85447ea8a3a60b7be31f2c). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 636.891048] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-48d368d0-0383-415f-a2b0-2fbeae2579ae {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 636.901201] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 636.902566] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Skipping network cache update for instance because it is Building. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 636.902566] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 636.902566] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 636.902566] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 636.902566] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 636.903210] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 636.903297] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 636.903544] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 636.903914] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 636.906305] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 636.906305] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 636.906734] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Created folder: Project (60b875938b85447ea8a3a60b7be31f2c) in parent group-v689771. [ 636.906904] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Creating folder: Instances. 
Parent ref: group-v689778. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 636.907229] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 636.908490] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f709d38a-fda3-4ee0-914a-7df76a204cda {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 636.921485] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Created folder: Instances in parent group-v689778. [ 636.921485] env[67893]: DEBUG oslo.service.loopingcall [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 636.921485] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 636.921485] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f3e1db76-3863-44b1-b897-c7ae42491e3a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 636.939198] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 636.939388] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 636.939557] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 636.939751] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 636.941298] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a830a3a-7cd7-4837-abd7-3445781015ff {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 636.946446] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 636.946446] env[67893]: value = "task-3455295" [ 
636.946446] env[67893]: _type = "Task" [ 636.946446] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 636.960114] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455295, 'name': CreateVM_Task} progress is 6%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 636.967023] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-126e1d56-141f-43f3-b5f9-e9450e4d718e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 636.984130] env[67893]: DEBUG nova.network.neutron [req-41d789e0-9ae4-4ff8-9625-d1ba5d09269e req-95d282be-36c1-4d2e-9919-ae46a7347473 service nova] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Updated VIF entry in instance network info cache for port 45705223-1cd2-4770-b083-9cd313b2aa10. {{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 636.984772] env[67893]: DEBUG nova.network.neutron [req-41d789e0-9ae4-4ff8-9625-d1ba5d09269e req-95d282be-36c1-4d2e-9919-ae46a7347473 service nova] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Updating instance_info_cache with network_info: [{"id": "45705223-1cd2-4770-b083-9cd313b2aa10", "address": "fa:16:3e:cd:28:cf", "network": {"id": "0158027f-656f-43e3-aebf-7cbb75cfd948", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.207", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "304a5519bb7c46efb34a42749d9cf409", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a8b99a46-3e7f-4ef1-9e45-58e6cd17f210", "external-id": "nsx-vlan-transportzone-704", "segmentation_id": 704, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap45705223-1c", "ovs_interfaceid": "45705223-1cd2-4770-b083-9cd313b2aa10", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 636.986626] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-932ea266-60e1-45b1-bb91-aa7d56a29f75 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 636.994812] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd7cfd8f-44b2-4574-b632-dbe7750f8be7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 637.002422] env[67893]: DEBUG oslo_concurrency.lockutils [req-41d789e0-9ae4-4ff8-9625-d1ba5d09269e req-95d282be-36c1-4d2e-9919-ae46a7347473 service nova] Releasing lock "refresh_cache-0d074bfa-7d3d-4e69-b544-36e7d9f79483" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 637.030386] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node 
resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180955MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 637.030590] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 637.030827] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 637.120661] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 0d074bfa-7d3d-4e69-b544-36e7d9f79483 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 637.120661] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance d9e47a83-7921-4cf6-ba99-fb705bc52e4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 637.120661] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 30d52736-4195-4767-89e0-8572dc96de29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 637.120661] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 637.121069] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 4 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 637.121069] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1024MB phys_disk=200GB used_disk=4GB total_vcpus=48 used_vcpus=4 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 637.125550] env[67893]: DEBUG nova.compute.manager [req-5b477785-a82b-49e3-acbf-b67c2d964526 req-ec0d391f-d249-4088-ac1b-58fd417fa47c service nova] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Received event network-vif-plugged-88a3ec34-f85d-4b1a-8407-d70197c799a7 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 637.125550] env[67893]: DEBUG oslo_concurrency.lockutils [req-5b477785-a82b-49e3-acbf-b67c2d964526 req-ec0d391f-d249-4088-ac1b-58fd417fa47c service nova] Acquiring lock "30d52736-4195-4767-89e0-8572dc96de29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 637.126594] env[67893]: DEBUG oslo_concurrency.lockutils [req-5b477785-a82b-49e3-acbf-b67c2d964526 req-ec0d391f-d249-4088-ac1b-58fd417fa47c service nova] Lock "30d52736-4195-4767-89e0-8572dc96de29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 637.126594] env[67893]: DEBUG oslo_concurrency.lockutils [req-5b477785-a82b-49e3-acbf-b67c2d964526 req-ec0d391f-d249-4088-ac1b-58fd417fa47c service nova] Lock "30d52736-4195-4767-89e0-8572dc96de29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 637.126594] env[67893]: DEBUG nova.compute.manager [req-5b477785-a82b-49e3-acbf-b67c2d964526 req-ec0d391f-d249-4088-ac1b-58fd417fa47c service nova] [instance: 30d52736-4195-4767-89e0-8572dc96de29] No waiting events found dispatching network-vif-plugged-88a3ec34-f85d-4b1a-8407-d70197c799a7 {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 637.126594] env[67893]: WARNING nova.compute.manager [req-5b477785-a82b-49e3-acbf-b67c2d964526 req-ec0d391f-d249-4088-ac1b-58fd417fa47c service nova] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Received unexpected event network-vif-plugged-88a3ec34-f85d-4b1a-8407-d70197c799a7 for instance with vm_state building and task_state spawning. 
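The "Acquiring lock … by …" / "acquired … :: waited 0.000s" / ""released" … :: held 0.000s" triplets in the entries above are emitted by oslo_concurrency.lockutils, not by Nova's own code. A minimal sketch of the two usage forms visible in this trace, with illustrative names only (the event bookkeeping below is a stand-in, not Nova's real implementation):

    from oslo_concurrency import lockutils

    # Decorator form: produces the lockutils.py:404/409/423 lines seen above
    # ('Acquiring lock ... by ...', 'acquired ... waited', '"released" ... held').
    @lockutils.synchronized('compute_resources')
    def claim_resources():
        pass  # critical section guarded by the named lock

    # Context-manager form: produces the lockutils.py:312/315/333 lines
    # ('Acquiring lock', 'Acquired lock', 'Releasing lock').
    def pop_instance_event(instance_uuid, events):
        with lockutils.lock('%s-events' % instance_uuid):
            return events.pop(instance_uuid, None)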
[ 637.184641] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 637.184766] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 637.186040] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 637.228160] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cffb5333-2b16-48b5-89f6-d6285ad0c13c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 637.240025] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96ab6b7c-cee5-4a18-b9f3-97e882b0b453 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 637.274213] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c7174de-32e3-40a0-b7e6-555a75ab0a61 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 637.284728] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc29b2f0-8fdd-421a-9b33-5b01fa5e0982 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 637.297856] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 637.312179] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 637.328778] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 
{{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 637.329134] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.298s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 637.409039] env[67893]: DEBUG oslo_concurrency.lockutils [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Acquiring lock "61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 637.409039] env[67893]: DEBUG oslo_concurrency.lockutils [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Lock "61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 637.423385] env[67893]: DEBUG nova.compute.manager [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 637.464358] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455295, 'name': CreateVM_Task, 'duration_secs': 0.355266} completed successfully. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 637.468637] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 637.468637] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 637.468637] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 637.468637] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 637.468637] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5c35272a-e7d0-4152-93b4-7d19b6485b35 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 637.475277] env[67893]: DEBUG oslo_vmware.api [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Waiting for the task: (returnval){ [ 637.475277] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52935575-366f-aeb9-e773-9cce55b046c6" [ 637.475277] env[67893]: _type = "Task" [ 637.475277] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 637.483910] env[67893]: DEBUG oslo_vmware.api [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52935575-366f-aeb9-e773-9cce55b046c6, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 637.510411] env[67893]: DEBUG oslo_concurrency.lockutils [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 637.510659] env[67893]: DEBUG oslo_concurrency.lockutils [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 637.512641] env[67893]: INFO nova.compute.claims [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 637.682745] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93622f60-9273-4981-9a0f-0bd31e46f4ab {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 637.691045] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8e3e057-c6ed-4889-8c89-3b8697f28d05 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 637.721530] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-292783b3-860d-412d-ab6e-1b01c87d0348 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 637.729420] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a49f478a-6162-4bf7-a91d-bd0321a68036 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 637.744923] env[67893]: DEBUG nova.compute.provider_tree [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 637.761595] env[67893]: DEBUG nova.scheduler.client.report [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 637.785947] env[67893]: DEBUG oslo_concurrency.lockutils [None req-590c3e11-6a12-4b94-b468-70e1b338205e 
tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.275s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 637.786491] env[67893]: DEBUG nova.compute.manager [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 637.845797] env[67893]: DEBUG nova.compute.utils [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 637.847186] env[67893]: DEBUG nova.compute.manager [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Not allocating networking since 'none' was specified. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 637.866375] env[67893]: DEBUG nova.compute.manager [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 637.991857] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 637.991857] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 637.991857] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 637.992723] env[67893]: DEBUG nova.compute.manager [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 638.028491] env[67893]: DEBUG nova.virt.hardware [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 638.030127] env[67893]: DEBUG nova.virt.hardware [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 638.030127] env[67893]: DEBUG nova.virt.hardware [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 638.030127] env[67893]: DEBUG nova.virt.hardware [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 638.030127] env[67893]: DEBUG nova.virt.hardware [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 638.030127] env[67893]: DEBUG nova.virt.hardware [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 638.030320] env[67893]: DEBUG nova.virt.hardware [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 638.030320] env[67893]: DEBUG nova.virt.hardware [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 638.030320] env[67893]: DEBUG nova.virt.hardware [None req-590c3e11-6a12-4b94-b468-70e1b338205e 
tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 638.030320] env[67893]: DEBUG nova.virt.hardware [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 638.030441] env[67893]: DEBUG nova.virt.hardware [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 638.031548] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5dae565-2b4b-49e6-9fb9-bcf9ba70fd37 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.039473] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e475c9e-9e69-4d88-8199-5a747a891a30 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.054221] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Instance VIF info [] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 638.060147] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Creating folder: Project (3b4ac1c229b94693ab591aed26c31265). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 638.061586] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1b98351e-aeda-4494-9b24-b1b15ecc2f2b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.073508] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Created folder: Project (3b4ac1c229b94693ab591aed26c31265) in parent group-v689771. [ 638.073707] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Creating folder: Instances. Parent ref: group-v689781. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 638.073939] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2e5c1cb4-6f70-4812-8fab-afcd408edade {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.083133] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Created folder: Instances in parent group-v689781. 
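Each Folder.CreateFolder / Folder.CreateVM_Task invocation above, followed by a "Waiting for the task: (returnval){ value = "task-…" }" block and "progress is N%" polling, is one oslo.vmware request/poll cycle. A minimal sketch of that cycle under an assumed, already-configured VMwareAPISession; the folder ref, config spec, and resource pool ref would come from vm_util in the real driver:

    from oslo_vmware import api

    def create_vm(session, vm_folder_ref, config_spec, res_pool_ref):
        # invoke_api issues the SOAP call, logged above as
        # 'Invoking Folder.CreateVM_Task with opID=oslo.vmware-...'.
        task = session.invoke_api(session.vim, 'CreateVM_Task', vm_folder_ref,
                                  config=config_spec, pool=res_pool_ref)
        # wait_for_task drives the "Task: {'id': 'task-...'} progress is N%"
        # polling seen above until the task completes (it raises on error).
        task_info = session.wait_for_task(task)
        return task_info.result  # managed object ref of the new VM

    # Hypothetical endpoint and credentials, for illustration only.
    session = api.VMwareAPISession('vc.example.com', 'user', 'secret',
                                   api_retry_count=10, task_poll_interval=0.5)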
[ 638.083340] env[67893]: DEBUG oslo.service.loopingcall [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 638.083522] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 638.083845] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-85227643-ae3b-4613-8d48-a29a7faf9121 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.102978] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 638.102978] env[67893]: value = "task-3455298" [ 638.102978] env[67893]: _type = "Task" [ 638.102978] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 638.113064] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455298, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 638.277179] env[67893]: DEBUG nova.network.neutron [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Successfully updated port: 97b5640a-9b72-4426-a6bc-8b1b5ab5a7f6 {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 638.303204] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Acquiring lock "refresh_cache-3e67c74f-5c03-4dc4-a23b-b547bfb32b4a" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 638.303204] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Acquired lock "refresh_cache-3e67c74f-5c03-4dc4-a23b-b547bfb32b4a" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 638.303204] env[67893]: DEBUG nova.network.neutron [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 638.450175] env[67893]: DEBUG nova.network.neutron [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 638.616858] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455298, 'name': CreateVM_Task, 'duration_secs': 0.30538} completed successfully. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 638.617953] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 638.617953] env[67893]: DEBUG oslo_concurrency.lockutils [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 638.617953] env[67893]: DEBUG oslo_concurrency.lockutils [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 638.618414] env[67893]: DEBUG oslo_concurrency.lockutils [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 638.618485] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1b7230ca-f6f0-4741-b46b-f3491d241e1e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.627486] env[67893]: DEBUG oslo_vmware.api [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Waiting for the task: (returnval){ [ 638.627486] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]5251632f-005f-96bb-2106-85a2548697b7" [ 638.627486] env[67893]: _type = "Task" [ 638.627486] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 638.637652] env[67893]: DEBUG oslo_vmware.api [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]5251632f-005f-96bb-2106-85a2548697b7, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 639.141016] env[67893]: DEBUG oslo_concurrency.lockutils [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 639.143262] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 639.143262] env[67893]: DEBUG oslo_concurrency.lockutils [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 639.193127] env[67893]: DEBUG nova.network.neutron [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Updating instance_info_cache with network_info: [{"id": "97b5640a-9b72-4426-a6bc-8b1b5ab5a7f6", "address": "fa:16:3e:17:e7:a9", "network": {"id": "0158027f-656f-43e3-aebf-7cbb75cfd948", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.187", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "304a5519bb7c46efb34a42749d9cf409", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a8b99a46-3e7f-4ef1-9e45-58e6cd17f210", "external-id": "nsx-vlan-transportzone-704", "segmentation_id": 704, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap97b5640a-9b", "ovs_interfaceid": "97b5640a-9b72-4426-a6bc-8b1b5ab5a7f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 639.210495] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Releasing lock "refresh_cache-3e67c74f-5c03-4dc4-a23b-b547bfb32b4a" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 639.211873] env[67893]: DEBUG nova.compute.manager [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Instance network_info: |[{"id": 
"97b5640a-9b72-4426-a6bc-8b1b5ab5a7f6", "address": "fa:16:3e:17:e7:a9", "network": {"id": "0158027f-656f-43e3-aebf-7cbb75cfd948", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.187", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "304a5519bb7c46efb34a42749d9cf409", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a8b99a46-3e7f-4ef1-9e45-58e6cd17f210", "external-id": "nsx-vlan-transportzone-704", "segmentation_id": 704, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap97b5640a-9b", "ovs_interfaceid": "97b5640a-9b72-4426-a6bc-8b1b5ab5a7f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 639.212032] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:17:e7:a9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a8b99a46-3e7f-4ef1-9e45-58e6cd17f210', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '97b5640a-9b72-4426-a6bc-8b1b5ab5a7f6', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 639.227519] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Creating folder: Project (78817a6a194f4038a2d8c6bdac194466). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 639.230098] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4cce616f-d2e5-44c5-adea-0af924e0294a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 639.245227] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Created folder: Project (78817a6a194f4038a2d8c6bdac194466) in parent group-v689771. [ 639.245434] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Creating folder: Instances. Parent ref: group-v689784. 
{{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 639.245681] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-74397361-9aa9-4f0d-b767-44be8bc906d3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 639.258476] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Created folder: Instances in parent group-v689784. [ 639.258716] env[67893]: DEBUG oslo.service.loopingcall [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 639.258984] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 639.259240] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-107d817b-e31c-4374-a724-fb4291e900eb {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 639.282748] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 639.282748] env[67893]: value = "task-3455301" [ 639.282748] env[67893]: _type = "Task" [ 639.282748] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 639.291268] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455301, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 639.346942] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Acquiring lock "043c631c-bf15-4b4c-9a92-49ea51b6d405" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 639.347257] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Lock "043c631c-bf15-4b4c-9a92-49ea51b6d405" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 639.365076] env[67893]: DEBUG nova.compute.manager [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Starting instance... 
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 639.461684] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 639.462141] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 639.463815] env[67893]: INFO nova.compute.claims [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 639.707033] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9211b77a-1069-4c03-be0b-f20a41fe67a8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 639.717330] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25907ff5-e928-4455-bdfe-f364eb7e84cf {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 639.749130] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d5e42af-ff4a-4afc-9ce0-d4e2ac07b612 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 639.760827] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b766567-f20a-4541-accc-822108bbcfe0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 639.777143] env[67893]: DEBUG nova.compute.provider_tree [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 639.790189] env[67893]: DEBUG nova.scheduler.client.report [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 
639.801446] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455301, 'name': CreateVM_Task, 'duration_secs': 0.304032} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 639.801446] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 639.804107] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 639.804277] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 639.804859] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 639.805410] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b1f7f8ea-aa43-4575-a4e2-95b26f145939 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 639.811392] env[67893]: DEBUG oslo_vmware.api [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Waiting for the task: (returnval){ [ 639.811392] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]529b7948-ba27-85a6-38ae-d5c2d732392c" [ 639.811392] env[67893]: _type = "Task" [ 639.811392] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 639.817810] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.356s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 639.818395] env[67893]: DEBUG nova.compute.manager [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Start building networks asynchronously for instance. 
{{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 639.825688] env[67893]: DEBUG oslo_vmware.api [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]529b7948-ba27-85a6-38ae-d5c2d732392c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 639.892123] env[67893]: DEBUG nova.compute.utils [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 639.892585] env[67893]: DEBUG nova.compute.manager [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 639.898608] env[67893]: DEBUG nova.network.neutron [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 639.917196] env[67893]: DEBUG nova.compute.manager [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 640.018270] env[67893]: DEBUG nova.compute.manager [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 640.021518] env[67893]: DEBUG nova.policy [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '26088b2228344b5bad914f5781f68a0d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bc6e946192b540f0866e979984dc8fe6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 640.027170] env[67893]: DEBUG nova.compute.manager [req-1f23287f-0444-4de8-9305-971e86123245 req-6f1e0d4a-d7cd-4c2e-b4bd-4869dfb949fd service nova] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Received event network-vif-plugged-97b5640a-9b72-4426-a6bc-8b1b5ab5a7f6 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 640.027368] env[67893]: DEBUG oslo_concurrency.lockutils [req-1f23287f-0444-4de8-9305-971e86123245 req-6f1e0d4a-d7cd-4c2e-b4bd-4869dfb949fd service nova] Acquiring lock "3e67c74f-5c03-4dc4-a23b-b547bfb32b4a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 640.028068] env[67893]: DEBUG oslo_concurrency.lockutils [req-1f23287f-0444-4de8-9305-971e86123245 req-6f1e0d4a-d7cd-4c2e-b4bd-4869dfb949fd service nova] Lock "3e67c74f-5c03-4dc4-a23b-b547bfb32b4a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 640.028281] env[67893]: DEBUG oslo_concurrency.lockutils [req-1f23287f-0444-4de8-9305-971e86123245 req-6f1e0d4a-d7cd-4c2e-b4bd-4869dfb949fd service nova] Lock "3e67c74f-5c03-4dc4-a23b-b547bfb32b4a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 640.028453] env[67893]: DEBUG nova.compute.manager [req-1f23287f-0444-4de8-9305-971e86123245 req-6f1e0d4a-d7cd-4c2e-b4bd-4869dfb949fd service nova] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] No waiting events found dispatching network-vif-plugged-97b5640a-9b72-4426-a6bc-8b1b5ab5a7f6 {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 640.028616] env[67893]: WARNING nova.compute.manager [req-1f23287f-0444-4de8-9305-971e86123245 req-6f1e0d4a-d7cd-4c2e-b4bd-4869dfb949fd service nova] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Received unexpected event network-vif-plugged-97b5640a-9b72-4426-a6bc-8b1b5ab5a7f6 for instance with vm_state building and task_state spawning. 
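The pair of entries just above records a benign race in Nova's external-event handshake: Neutron reported network-vif-plugged-97b5640a-9b72-4426-a6bc-8b1b5ab5a7f6 before the compute manager had prepared a waiter for it, so pop_instance_event found no waiting events and the event was dropped with a warning. A minimal Python sketch of that prepare/dispatch pattern follows; it is illustrative only, and InstanceEventRegistry and its method names are hypothetical stand-ins, not Nova's actual API:

    import threading

    class InstanceEventRegistry:
        """Hypothetical stand-in for the per-instance event table that the
        "<uuid>-events" lock entries above are guarding."""

        def __init__(self):
            self._lock = threading.Lock()
            # (instance_uuid, event_name) -> threading.Event
            self._waiters = {}

        def prepare(self, instance_uuid, event_name):
            # Register interest *before* starting the action that triggers
            # the event (e.g. asking Neutron to plug a VIF), so that a fast
            # callback still finds a waiter.
            waiter = threading.Event()
            with self._lock:
                self._waiters[(instance_uuid, event_name)] = waiter
            return waiter

        def dispatch(self, instance_uuid, event_name):
            # Deliver an incoming external event. Returning False mirrors
            # the "No waiting events found dispatching ..." case, which the
            # caller then logs as an unexpected event.
            with self._lock:
                waiter = self._waiters.pop((instance_uuid, event_name), None)
            if waiter is None:
                return False
            waiter.set()
            return True

    registry = InstanceEventRegistry()

    # Event arrives before anyone prepared for it -> "unexpected", as above.
    if not registry.dispatch("3e67c74f", "network-vif-plugged-97b5640a"):
        print("unexpected event: nobody was waiting")

    # Prepared first, the same event wakes the waiter instead.
    waiter = registry.prepare("3e67c74f", "network-vif-plugged-97b5640a")
    registry.dispatch("3e67c74f", "network-vif-plugged-97b5640a")
    print("event delivered:", waiter.wait(timeout=1.0))

Here the race is harmless: the build continues, and the instance's network info cache is refreshed when the network-changed event for the same port arrives (see the entries at 646.264721 below).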
[ 640.047460] env[67893]: DEBUG nova.virt.hardware [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=<?>,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-15T20:39:38Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 640.047756] env[67893]: DEBUG nova.virt.hardware [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 640.047860] env[67893]: DEBUG nova.virt.hardware [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 640.048302] env[67893]: DEBUG nova.virt.hardware [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 640.048499] env[67893]: DEBUG nova.virt.hardware [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 640.048648] env[67893]: DEBUG nova.virt.hardware [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 640.048857] env[67893]: DEBUG nova.virt.hardware [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 640.049037] env[67893]: DEBUG nova.virt.hardware [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 640.049207] env[67893]: 
DEBUG nova.virt.hardware [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 640.049363] env[67893]: DEBUG nova.virt.hardware [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 640.049526] env[67893]: DEBUG nova.virt.hardware [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 640.050701] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b994857c-d0c1-4234-a564-010ab588be24 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.059647] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5aa2e0fb-21fe-468e-8274-579de97619ba {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.324160] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 640.324470] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 640.325034] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 640.616530] env[67893]: DEBUG nova.network.neutron [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Successfully created port: c9265da9-f56e-4745-94b4-71b1e8fe10ec {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 641.030198] env[67893]: DEBUG nova.compute.manager [req-c1abd823-ae7f-4593-8c22-360e3bebefc7 req-dc121a26-6132-493c-a890-d01bc05aab0b service nova] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Received event network-vif-plugged-4b6f4074-f5df-4e53-a29d-c06d78ea3ec6 {{(pid=67893) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:11101}} [ 641.030694] env[67893]: DEBUG oslo_concurrency.lockutils [req-c1abd823-ae7f-4593-8c22-360e3bebefc7 req-dc121a26-6132-493c-a890-d01bc05aab0b service nova] Acquiring lock "d9e47a83-7921-4cf6-ba99-fb705bc52e4a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 641.031052] env[67893]: DEBUG oslo_concurrency.lockutils [req-c1abd823-ae7f-4593-8c22-360e3bebefc7 req-dc121a26-6132-493c-a890-d01bc05aab0b service nova] Lock "d9e47a83-7921-4cf6-ba99-fb705bc52e4a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 641.031438] env[67893]: DEBUG oslo_concurrency.lockutils [req-c1abd823-ae7f-4593-8c22-360e3bebefc7 req-dc121a26-6132-493c-a890-d01bc05aab0b service nova] Lock "d9e47a83-7921-4cf6-ba99-fb705bc52e4a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 641.031612] env[67893]: DEBUG nova.compute.manager [req-c1abd823-ae7f-4593-8c22-360e3bebefc7 req-dc121a26-6132-493c-a890-d01bc05aab0b service nova] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] No waiting events found dispatching network-vif-plugged-4b6f4074-f5df-4e53-a29d-c06d78ea3ec6 {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 641.031776] env[67893]: WARNING nova.compute.manager [req-c1abd823-ae7f-4593-8c22-360e3bebefc7 req-dc121a26-6132-493c-a890-d01bc05aab0b service nova] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Received unexpected event network-vif-plugged-4b6f4074-f5df-4e53-a29d-c06d78ea3ec6 for instance with vm_state building and task_state spawning. [ 641.031939] env[67893]: DEBUG nova.compute.manager [req-c1abd823-ae7f-4593-8c22-360e3bebefc7 req-dc121a26-6132-493c-a890-d01bc05aab0b service nova] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Received event network-changed-88a3ec34-f85d-4b1a-8407-d70197c799a7 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 641.032101] env[67893]: DEBUG nova.compute.manager [req-c1abd823-ae7f-4593-8c22-360e3bebefc7 req-dc121a26-6132-493c-a890-d01bc05aab0b service nova] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Refreshing instance network info cache due to event network-changed-88a3ec34-f85d-4b1a-8407-d70197c799a7. 
{{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 641.032280] env[67893]: DEBUG oslo_concurrency.lockutils [req-c1abd823-ae7f-4593-8c22-360e3bebefc7 req-dc121a26-6132-493c-a890-d01bc05aab0b service nova] Acquiring lock "refresh_cache-30d52736-4195-4767-89e0-8572dc96de29" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 641.032407] env[67893]: DEBUG oslo_concurrency.lockutils [req-c1abd823-ae7f-4593-8c22-360e3bebefc7 req-dc121a26-6132-493c-a890-d01bc05aab0b service nova] Acquired lock "refresh_cache-30d52736-4195-4767-89e0-8572dc96de29" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 641.032654] env[67893]: DEBUG nova.network.neutron [req-c1abd823-ae7f-4593-8c22-360e3bebefc7 req-dc121a26-6132-493c-a890-d01bc05aab0b service nova] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Refreshing network info cache for port 88a3ec34-f85d-4b1a-8407-d70197c799a7 {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 642.118475] env[67893]: DEBUG nova.network.neutron [req-c1abd823-ae7f-4593-8c22-360e3bebefc7 req-dc121a26-6132-493c-a890-d01bc05aab0b service nova] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Updated VIF entry in instance network info cache for port 88a3ec34-f85d-4b1a-8407-d70197c799a7. {{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 642.118848] env[67893]: DEBUG nova.network.neutron [req-c1abd823-ae7f-4593-8c22-360e3bebefc7 req-dc121a26-6132-493c-a890-d01bc05aab0b service nova] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Updating instance_info_cache with network_info: [{"id": "88a3ec34-f85d-4b1a-8407-d70197c799a7", "address": "fa:16:3e:4f:b5:e7", "network": {"id": "03c86b83-0a64-4a10-b435-29afa00ff204", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-12926787-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "71be834fce6a435ca74bd0b99b2ee3df", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3ff3baee-99ce-4b51-ae98-efc6163aaab3", "external-id": "nsx-vlan-transportzone-574", "segmentation_id": 574, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap88a3ec34-f8", "ovs_interfaceid": "88a3ec34-f85d-4b1a-8407-d70197c799a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 642.131973] env[67893]: DEBUG oslo_concurrency.lockutils [req-c1abd823-ae7f-4593-8c22-360e3bebefc7 req-dc121a26-6132-493c-a890-d01bc05aab0b service nova] Releasing lock "refresh_cache-30d52736-4195-4767-89e0-8572dc96de29" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 642.131973] env[67893]: DEBUG nova.compute.manager [req-c1abd823-ae7f-4593-8c22-360e3bebefc7 req-dc121a26-6132-493c-a890-d01bc05aab0b service nova] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Received 
event network-changed-4b6f4074-f5df-4e53-a29d-c06d78ea3ec6 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 642.131973] env[67893]: DEBUG nova.compute.manager [req-c1abd823-ae7f-4593-8c22-360e3bebefc7 req-dc121a26-6132-493c-a890-d01bc05aab0b service nova] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Refreshing instance network info cache due to event network-changed-4b6f4074-f5df-4e53-a29d-c06d78ea3ec6. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 642.131973] env[67893]: DEBUG oslo_concurrency.lockutils [req-c1abd823-ae7f-4593-8c22-360e3bebefc7 req-dc121a26-6132-493c-a890-d01bc05aab0b service nova] Acquiring lock "refresh_cache-d9e47a83-7921-4cf6-ba99-fb705bc52e4a" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 642.131973] env[67893]: DEBUG oslo_concurrency.lockutils [req-c1abd823-ae7f-4593-8c22-360e3bebefc7 req-dc121a26-6132-493c-a890-d01bc05aab0b service nova] Acquired lock "refresh_cache-d9e47a83-7921-4cf6-ba99-fb705bc52e4a" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 642.132232] env[67893]: DEBUG nova.network.neutron [req-c1abd823-ae7f-4593-8c22-360e3bebefc7 req-dc121a26-6132-493c-a890-d01bc05aab0b service nova] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Refreshing network info cache for port 4b6f4074-f5df-4e53-a29d-c06d78ea3ec6 {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 643.396187] env[67893]: DEBUG nova.network.neutron [req-c1abd823-ae7f-4593-8c22-360e3bebefc7 req-dc121a26-6132-493c-a890-d01bc05aab0b service nova] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Updated VIF entry in instance network info cache for port 4b6f4074-f5df-4e53-a29d-c06d78ea3ec6. 
{{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 643.396850] env[67893]: DEBUG nova.network.neutron [req-c1abd823-ae7f-4593-8c22-360e3bebefc7 req-dc121a26-6132-493c-a890-d01bc05aab0b service nova] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Updating instance_info_cache with network_info: [{"id": "4b6f4074-f5df-4e53-a29d-c06d78ea3ec6", "address": "fa:16:3e:40:fc:04", "network": {"id": "0158027f-656f-43e3-aebf-7cbb75cfd948", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.117", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "304a5519bb7c46efb34a42749d9cf409", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a8b99a46-3e7f-4ef1-9e45-58e6cd17f210", "external-id": "nsx-vlan-transportzone-704", "segmentation_id": 704, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4b6f4074-f5", "ovs_interfaceid": "4b6f4074-f5df-4e53-a29d-c06d78ea3ec6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 643.402092] env[67893]: DEBUG nova.network.neutron [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Successfully updated port: c9265da9-f56e-4745-94b4-71b1e8fe10ec {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 643.413970] env[67893]: DEBUG oslo_concurrency.lockutils [req-c1abd823-ae7f-4593-8c22-360e3bebefc7 req-dc121a26-6132-493c-a890-d01bc05aab0b service nova] Releasing lock "refresh_cache-d9e47a83-7921-4cf6-ba99-fb705bc52e4a" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 643.420280] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Acquiring lock "refresh_cache-043c631c-bf15-4b4c-9a92-49ea51b6d405" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 643.420439] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Acquired lock "refresh_cache-043c631c-bf15-4b4c-9a92-49ea51b6d405" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 643.420590] env[67893]: DEBUG nova.network.neutron [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 643.503120] env[67893]: DEBUG nova.network.neutron [None 
req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 643.823625] env[67893]: DEBUG nova.network.neutron [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Updating instance_info_cache with network_info: [{"id": "c9265da9-f56e-4745-94b4-71b1e8fe10ec", "address": "fa:16:3e:9a:ee:dd", "network": {"id": "0158027f-656f-43e3-aebf-7cbb75cfd948", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "304a5519bb7c46efb34a42749d9cf409", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a8b99a46-3e7f-4ef1-9e45-58e6cd17f210", "external-id": "nsx-vlan-transportzone-704", "segmentation_id": 704, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc9265da9-f5", "ovs_interfaceid": "c9265da9-f56e-4745-94b4-71b1e8fe10ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 643.848147] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Releasing lock "refresh_cache-043c631c-bf15-4b4c-9a92-49ea51b6d405" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 643.848634] env[67893]: DEBUG nova.compute.manager [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Instance network_info: |[{"id": "c9265da9-f56e-4745-94b4-71b1e8fe10ec", "address": "fa:16:3e:9a:ee:dd", "network": {"id": "0158027f-656f-43e3-aebf-7cbb75cfd948", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "304a5519bb7c46efb34a42749d9cf409", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a8b99a46-3e7f-4ef1-9e45-58e6cd17f210", "external-id": "nsx-vlan-transportzone-704", "segmentation_id": 704, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc9265da9-f5", "ovs_interfaceid": "c9265da9-f56e-4745-94b4-71b1e8fe10ec", "qbh_params": null, 
"qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 643.849518] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9a:ee:dd', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a8b99a46-3e7f-4ef1-9e45-58e6cd17f210', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c9265da9-f56e-4745-94b4-71b1e8fe10ec', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 643.858948] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Creating folder: Project (bc6e946192b540f0866e979984dc8fe6). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 643.859680] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-738c6f11-0afa-423b-ba09-88e4445a0e9e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 643.871849] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Created folder: Project (bc6e946192b540f0866e979984dc8fe6) in parent group-v689771. [ 643.872070] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Creating folder: Instances. Parent ref: group-v689787. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 643.874125] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-05194a52-478d-47c4-8298-a370c5b7f596 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 643.881767] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Created folder: Instances in parent group-v689787. [ 643.882043] env[67893]: DEBUG oslo.service.loopingcall [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 643.882204] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 643.882409] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e62bc702-af35-42ff-be4f-c2279bdeaafb {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 643.914314] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 643.914314] env[67893]: value = "task-3455304" [ 643.914314] env[67893]: _type = "Task" [ 643.914314] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 643.923750] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455304, 'name': CreateVM_Task} progress is 5%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 644.125154] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Acquiring lock "96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 644.125465] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Lock "96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 644.151737] env[67893]: DEBUG nova.compute.manager [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Starting instance... 
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 644.240862] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 644.241179] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 644.245366] env[67893]: INFO nova.compute.claims [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 644.427778] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455304, 'name': CreateVM_Task, 'duration_secs': 0.328855} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 644.429448] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 644.430945] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 644.430945] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 644.431182] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 644.431496] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9cb3bc13-ada7-462f-b21c-1cd3512587b1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 644.442207] env[67893]: DEBUG oslo_vmware.api [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Waiting for the task: (returnval){ [ 644.442207] env[67893]: value = 
"session[522faadd-04b9-9c79-e1e4-fb685291944d]529c2a35-6797-d361-93f2-1e8da5af64bd" [ 644.442207] env[67893]: _type = "Task" [ 644.442207] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 644.457742] env[67893]: DEBUG oslo_vmware.api [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]529c2a35-6797-d361-93f2-1e8da5af64bd, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 644.480904] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dadece8d-6694-4f0d-88ff-671c58544c53 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 644.487063] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d392103-c5df-4d68-9ed1-e80f3cb53938 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 644.522038] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ede22bdb-af16-4ade-bc1c-273902d76f2f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 644.532356] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1519568f-76b0-4316-917b-7f0519fad6c6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 644.549414] env[67893]: DEBUG nova.compute.provider_tree [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 644.571788] env[67893]: DEBUG nova.scheduler.client.report [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 644.591265] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.350s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 644.591767] env[67893]: DEBUG nova.compute.manager [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 
tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 644.650140] env[67893]: DEBUG nova.compute.utils [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 644.651420] env[67893]: DEBUG nova.compute.manager [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 644.651878] env[67893]: DEBUG nova.network.neutron [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 644.663580] env[67893]: DEBUG nova.compute.manager [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 644.749867] env[67893]: DEBUG nova.compute.manager [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 644.781865] env[67893]: DEBUG nova.virt.hardware [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=<?>,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-15T20:39:38Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 644.782131] env[67893]: DEBUG nova.virt.hardware [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 644.782290] env[67893]: DEBUG nova.virt.hardware [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 644.782469] env[67893]: DEBUG nova.virt.hardware [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 644.783472] env[67893]: DEBUG nova.virt.hardware [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 644.783472] env[67893]: DEBUG nova.virt.hardware [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 644.783472] env[67893]: DEBUG nova.virt.hardware [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 644.783472] env[67893]: DEBUG nova.virt.hardware [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 644.783472] env[67893]: DEBUG nova.virt.hardware [None 
req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 644.783776] env[67893]: DEBUG nova.virt.hardware [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 644.783776] env[67893]: DEBUG nova.virt.hardware [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 644.784462] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c831160-601d-45b7-84b3-a54c1f57c69e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 644.793528] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57168aa5-9da8-4041-b16c-11c29342db76 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 644.810131] env[67893]: DEBUG nova.policy [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'afe837e2f7db454ab50e8c90c905dc79', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '003516068deb46448b67dcdb7d11c345', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 644.955595] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 644.955908] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 644.956215] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 645.463636] 
env[67893]: DEBUG nova.network.neutron [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Successfully created port: 7e5ed9ec-a2a6-4c0a-9486-5c4280904e37 {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 645.976494] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Acquiring lock "2f69fae8-d060-4156-8880-071f5ee1f969" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 645.976755] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Lock "2f69fae8-d060-4156-8880-071f5ee1f969" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 645.991464] env[67893]: DEBUG nova.compute.manager [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 646.058280] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 646.058280] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 646.058280] env[67893]: INFO nova.compute.claims [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 646.264721] env[67893]: DEBUG nova.compute.manager [req-b98024a9-3d46-4753-97a0-fac273f8d7f0 req-2be4da5a-164b-4741-8570-d302eefd11ea service nova] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Received event network-changed-97b5640a-9b72-4426-a6bc-8b1b5ab5a7f6 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 646.265207] env[67893]: DEBUG nova.compute.manager [req-b98024a9-3d46-4753-97a0-fac273f8d7f0 req-2be4da5a-164b-4741-8570-d302eefd11ea service nova] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Refreshing instance network info cache due to event 
network-changed-97b5640a-9b72-4426-a6bc-8b1b5ab5a7f6. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 646.265299] env[67893]: DEBUG oslo_concurrency.lockutils [req-b98024a9-3d46-4753-97a0-fac273f8d7f0 req-2be4da5a-164b-4741-8570-d302eefd11ea service nova] Acquiring lock "refresh_cache-3e67c74f-5c03-4dc4-a23b-b547bfb32b4a" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 646.265391] env[67893]: DEBUG oslo_concurrency.lockutils [req-b98024a9-3d46-4753-97a0-fac273f8d7f0 req-2be4da5a-164b-4741-8570-d302eefd11ea service nova] Acquired lock "refresh_cache-3e67c74f-5c03-4dc4-a23b-b547bfb32b4a" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 646.265544] env[67893]: DEBUG nova.network.neutron [req-b98024a9-3d46-4753-97a0-fac273f8d7f0 req-2be4da5a-164b-4741-8570-d302eefd11ea service nova] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Refreshing network info cache for port 97b5640a-9b72-4426-a6bc-8b1b5ab5a7f6 {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 646.293306] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-397f023c-0254-4cac-9927-87be4eb47f63 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 646.301429] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21665129-5537-47b5-a0c3-7869cd1a06ca {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 646.337675] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e85c1d6-fd59-40cd-8cea-9f607df004d8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 646.346625] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e8df23e-fd3d-41c8-a7ee-e14511f3b031 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 646.361343] env[67893]: DEBUG nova.compute.provider_tree [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 646.375169] env[67893]: DEBUG nova.scheduler.client.report [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 646.393761] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c0654902-721a-4864-9744-0ec6060e0acd 
tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.337s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 646.394287] env[67893]: DEBUG nova.compute.manager [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 646.453555] env[67893]: DEBUG nova.compute.utils [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 646.459343] env[67893]: DEBUG nova.compute.manager [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 646.459550] env[67893]: DEBUG nova.network.neutron [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 646.491368] env[67893]: DEBUG nova.compute.manager [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 646.638326] env[67893]: DEBUG nova.compute.manager [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Start spawning the instance on the hypervisor. 
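
The spawn step logged here immediately re-runs the CPU-topology selection already seen for instance 96f86f1d: nova.virt.hardware takes the flavor and image limits (all unset, so they default to 65536 for sockets, cores and threads) and enumerates every factorisation of the vCPU count. The sketch below is a simplified re-implementation for illustration only (VirtCPUTopology and possible_topologies here are stand-ins, not Nova's own code, and the final sort against the preferred topology is omitted); it shows why a 1-vCPU m1.nano flavor always comes out as sockets=1, cores=1, threads=1.

    # Simplified sketch of the _get_possible_cpu_topologies step; assumes the
    # defaulted 65536:65536:65536 limits visible in the log records above.
    from collections import namedtuple

    VirtCPUTopology = namedtuple("VirtCPUTopology", "sockets cores threads")

    def possible_topologies(vcpus, maximum):
        """Enumerate every sockets*cores*threads factorisation of vcpus."""
        found = []
        for s in range(1, min(vcpus, maximum.sockets) + 1):
            for c in range(1, min(vcpus // s, maximum.cores) + 1):
                if vcpus % (s * c):
                    continue  # not an exact factorisation, skip
                t = vcpus // (s * c)
                if t <= maximum.threads:
                    found.append(VirtCPUTopology(s, c, t))
        return found

    maximum = VirtCPUTopology(65536, 65536, 65536)
    # For the 1-vCPU m1.nano flavor there is exactly one candidate:
    print(possible_topologies(1, maximum))  # [VirtCPUTopology(sockets=1, cores=1, threads=1)]
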
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 646.684770] env[67893]: DEBUG nova.virt.hardware [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 646.686518] env[67893]: DEBUG nova.virt.hardware [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 646.686518] env[67893]: DEBUG nova.virt.hardware [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 646.686518] env[67893]: DEBUG nova.virt.hardware [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 646.686518] env[67893]: DEBUG nova.virt.hardware [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 646.686518] env[67893]: DEBUG nova.virt.hardware [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 646.686770] env[67893]: DEBUG nova.virt.hardware [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 646.687117] env[67893]: DEBUG nova.virt.hardware [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 646.687968] env[67893]: DEBUG nova.virt.hardware [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 646.688339] env[67893]: DEBUG nova.virt.hardware [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 646.688596] env[67893]: DEBUG nova.virt.hardware [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 646.689967] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b95b5791-91c1-4f56-a50a-e88c99cdb6d4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 646.702190] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d624a4ed-6c4c-4aeb-8801-33e1f90d77f0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 646.718545] env[67893]: DEBUG nova.policy [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6c219a584c1244e4ad19730c503ed74d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4aa620f2e894427a862b1a595c9e019f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 647.173132] env[67893]: DEBUG nova.network.neutron [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Successfully updated port: 7e5ed9ec-a2a6-4c0a-9486-5c4280904e37 {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 647.194168] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Acquiring lock "refresh_cache-96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 647.194323] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Acquired lock "refresh_cache-96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5" {{(pid=67893) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 647.194731] env[67893]: DEBUG nova.network.neutron [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 647.199385] env[67893]: DEBUG nova.network.neutron [req-b98024a9-3d46-4753-97a0-fac273f8d7f0 req-2be4da5a-164b-4741-8570-d302eefd11ea service nova] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Updated VIF entry in instance network info cache for port 97b5640a-9b72-4426-a6bc-8b1b5ab5a7f6. {{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 647.199385] env[67893]: DEBUG nova.network.neutron [req-b98024a9-3d46-4753-97a0-fac273f8d7f0 req-2be4da5a-164b-4741-8570-d302eefd11ea service nova] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Updating instance_info_cache with network_info: [{"id": "97b5640a-9b72-4426-a6bc-8b1b5ab5a7f6", "address": "fa:16:3e:17:e7:a9", "network": {"id": "0158027f-656f-43e3-aebf-7cbb75cfd948", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.187", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "304a5519bb7c46efb34a42749d9cf409", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a8b99a46-3e7f-4ef1-9e45-58e6cd17f210", "external-id": "nsx-vlan-transportzone-704", "segmentation_id": 704, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap97b5640a-9b", "ovs_interfaceid": "97b5640a-9b72-4426-a6bc-8b1b5ab5a7f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 647.224529] env[67893]: DEBUG oslo_concurrency.lockutils [req-b98024a9-3d46-4753-97a0-fac273f8d7f0 req-2be4da5a-164b-4741-8570-d302eefd11ea service nova] Releasing lock "refresh_cache-3e67c74f-5c03-4dc4-a23b-b547bfb32b4a" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 647.519426] env[67893]: DEBUG nova.network.neutron [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Instance cache missing network info. 
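
The two cache operations above show the shape of Nova's instance network info cache: a JSON list of VIF dicts keyed by port id, refreshed when Neutron emits a network-changed event (the "Updated VIF entry ... for port 97b5640a" record for 3e67c74f) and initially empty for a freshly created port (the "Instance cache missing network info" record for 96f86f1d). A minimal sketch of the replace-or-append update follows, using a trimmed-down VIF dict rather than the full structure shown in the log; the real model lives in nova.network.model.

    # Illustrative replace-or-append update of a cached network_info list.
    import json

    cached = json.loads("""[{"id": "97b5640a-9b72-4426-a6bc-8b1b5ab5a7f6",
                             "address": "fa:16:3e:17:e7:a9", "active": false}]""")

    def update_vif_entry(cache, fresh_vif):
        # Replace the VIF with the same port id, or append if it is missing
        # (the "Instance cache missing network info" case for a new port).
        for i, vif in enumerate(cache):
            if vif["id"] == fresh_vif["id"]:
                cache[i] = fresh_vif
                return cache
        cache.append(fresh_vif)
        return cache

    update_vif_entry(cached, {"id": "97b5640a-9b72-4426-a6bc-8b1b5ab5a7f6",
                              "address": "fa:16:3e:17:e7:a9", "active": True})
    print(json.dumps(cached, indent=2))
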
{{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 647.865264] env[67893]: DEBUG nova.network.neutron [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Successfully created port: d715a161-0823-4255-9b75-f8ec650c03bf {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 648.205983] env[67893]: DEBUG nova.network.neutron [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Updating instance_info_cache with network_info: [{"id": "7e5ed9ec-a2a6-4c0a-9486-5c4280904e37", "address": "fa:16:3e:c3:97:b2", "network": {"id": "f28d4f17-ac87-4b46-aae5-8b77669924a8", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1354805933-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "003516068deb46448b67dcdb7d11c345", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c4d3f69a-b086-4c3b-b976-5a848b63dfc4", "external-id": "nsx-vlan-transportzone-627", "segmentation_id": 627, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7e5ed9ec-a2", "ovs_interfaceid": "7e5ed9ec-a2a6-4c0a-9486-5c4280904e37", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 648.225109] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Releasing lock "refresh_cache-96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 648.225109] env[67893]: DEBUG nova.compute.manager [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Instance network_info: |[{"id": "7e5ed9ec-a2a6-4c0a-9486-5c4280904e37", "address": "fa:16:3e:c3:97:b2", "network": {"id": "f28d4f17-ac87-4b46-aae5-8b77669924a8", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1354805933-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "003516068deb46448b67dcdb7d11c345", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c4d3f69a-b086-4c3b-b976-5a848b63dfc4", 
"external-id": "nsx-vlan-transportzone-627", "segmentation_id": 627, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7e5ed9ec-a2", "ovs_interfaceid": "7e5ed9ec-a2a6-4c0a-9486-5c4280904e37", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 648.225255] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c3:97:b2', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c4d3f69a-b086-4c3b-b976-5a848b63dfc4', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7e5ed9ec-a2a6-4c0a-9486-5c4280904e37', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 648.236576] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Creating folder: Project (003516068deb46448b67dcdb7d11c345). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 648.237363] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-26ad2c46-eba5-476f-a5b3-9ef2851b35d9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 648.248744] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Created folder: Project (003516068deb46448b67dcdb7d11c345) in parent group-v689771. [ 648.249414] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Creating folder: Instances. Parent ref: group-v689793. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 648.249750] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8e292e57-df7b-44c0-aadd-cfef7b642f0e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 648.260085] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Created folder: Instances in parent group-v689793. [ 648.260085] env[67893]: DEBUG oslo.service.loopingcall [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 648.260085] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 648.260085] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6a8ef861-9ed8-4cb3-9b52-1d3a9c4edd39 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 648.279139] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 648.279139] env[67893]: value = "task-3455312" [ 648.279139] env[67893]: _type = "Task" [ 648.279139] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 648.289591] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455312, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 648.796019] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455312, 'name': CreateVM_Task, 'duration_secs': 0.406113} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 648.796019] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 648.796019] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 648.796019] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 648.796019] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 648.796611] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8e8c8f84-23b8-408c-a2d9-ae4edf719962 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 648.800949] env[67893]: DEBUG oslo_vmware.api [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Waiting for the task: (returnval){ [ 648.800949] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]525ad405-3ccc-6d9d-4517-718013f8e400" [ 648.800949] env[67893]: _type = "Task" [ 648.800949] env[67893]: } to complete. 
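
Each "Invoking ... _Task" call above returns a task handle immediately (the multi-line "(returnval)" block with _type = "Task") rather than a result, and the session's wait_for_task then polls the task until it leaves the running states, logging progress along the way; the task-3455312 records above show exactly that, from "progress is 0%" to "completed successfully ... duration_secs". A stand-alone sketch of that polling loop (fake_task_info simulates a task that finishes on the third poll; the real implementation is oslo.vmware's wait_for_task):

    # Rough shape of the task polling visible in this trace.
    import itertools
    import time

    def fake_task_info(polls=itertools.count()):
        # Simulates a vSphere task that finishes on the third poll; the
        # default-argument counter deliberately keeps state across calls.
        n = next(polls)
        if n < 2:
            return {"state": "running", "progress": n * 50}
        return {"state": "success", "progress": 100}

    def wait_for_task(poll_fn, interval=0.5):
        while True:
            info = poll_fn()
            if info["state"] == "running":
                print("progress is %d%%" % info["progress"])
                time.sleep(interval)
                continue
            if info["state"] == "success":
                return info
            raise RuntimeError("task failed: %r" % info)

    print(wait_for_task(fake_task_info, interval=0.01))
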
{{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 648.809802] env[67893]: DEBUG oslo_vmware.api [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]525ad405-3ccc-6d9d-4517-718013f8e400, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 649.000698] env[67893]: DEBUG oslo_concurrency.lockutils [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Acquiring lock "b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 649.000698] env[67893]: DEBUG oslo_concurrency.lockutils [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Lock "b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 649.023191] env[67893]: DEBUG nova.compute.manager [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 649.095594] env[67893]: DEBUG oslo_concurrency.lockutils [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 649.095594] env[67893]: DEBUG oslo_concurrency.lockutils [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 649.098222] env[67893]: INFO nova.compute.claims [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 649.321920] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 649.322366] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] 
Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 649.322660] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 649.364310] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b9f4740-2dd3-43a4-869f-4fff1282bb0d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 649.383581] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b933d89-e066-4d10-bf98-9179cbce4a91 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 649.425524] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8405ff82-d1fb-4c73-8d54-bfe5fcd75c3a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 649.434704] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b0f4faa-dc0c-41df-9ac0-2caf3c811629 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 649.977235] env[67893]: DEBUG nova.network.neutron [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Successfully updated port: d715a161-0823-4255-9b75-f8ec650c03bf {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 649.995353] env[67893]: DEBUG nova.compute.provider_tree [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 649.996703] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Acquiring lock "refresh_cache-2f69fae8-d060-4156-8880-071f5ee1f969" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 649.996794] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Acquired lock "refresh_cache-2f69fae8-d060-4156-8880-071f5ee1f969" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 649.998053] env[67893]: DEBUG nova.network.neutron [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Building network info cache for instance {{(pid=67893) 
_get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 650.009805] env[67893]: DEBUG nova.scheduler.client.report [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 650.029236] env[67893]: DEBUG oslo_concurrency.lockutils [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.933s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 650.029580] env[67893]: DEBUG nova.compute.manager [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 650.062320] env[67893]: DEBUG nova.network.neutron [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 650.084307] env[67893]: DEBUG nova.compute.utils [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 650.085901] env[67893]: DEBUG nova.compute.manager [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 650.086082] env[67893]: DEBUG nova.network.neutron [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 650.109155] env[67893]: DEBUG nova.compute.manager [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Start building block device mappings for instance. 
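
The scheduler report client records above illustrate how the resource tracker avoids needless placement traffic: the freshly computed inventory for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 is compared against the copy cached in the provider tree, and only a difference would trigger a PUT to the placement API. A minimal sketch of that comparison (update_inventory below is an illustrative stand-in for the provider-tree logic, using the inventory values from the log):

    # Compare local inventory against the cached provider-tree copy.
    local_inventory = {
        "VCPU": {"total": 48, "reserved": 0, "min_unit": 1, "max_unit": 16,
                 "step_size": 1, "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "min_unit": 1,
                      "max_unit": 65530, "step_size": 1, "allocation_ratio": 1.0},
        "DISK_GB": {"total": 400, "reserved": 0, "min_unit": 1, "max_unit": 95,
                    "step_size": 1, "allocation_ratio": 1.0},
    }

    def update_inventory(cached, new, provider):
        if cached == new:
            print("Inventory has not changed in ProviderTree for provider: %s"
                  % provider)
            return False  # nothing to push to the placement API
        cached.clear()
        cached.update(new)
        return True  # caller would PUT the new inventories to placement

    cache = dict(local_inventory)
    update_inventory(cache, local_inventory, "17b8bcc7-ce4b-4d4d-b863-33b2251dfd57")
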
{{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 650.249699] env[67893]: DEBUG nova.compute.manager [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Start spawning the instance on the hypervisor. {{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 650.290117] env[67893]: DEBUG nova.virt.hardware [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 650.290117] env[67893]: DEBUG nova.virt.hardware [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 650.290117] env[67893]: DEBUG nova.virt.hardware [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 650.290382] env[67893]: DEBUG nova.virt.hardware [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 650.290382] env[67893]: DEBUG nova.virt.hardware [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 650.290382] env[67893]: DEBUG nova.virt.hardware [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 650.290382] env[67893]: DEBUG nova.virt.hardware [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 650.290382] env[67893]: DEBUG nova.virt.hardware [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 
tempest-ServersTestJSON-81219247-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 650.290567] env[67893]: DEBUG nova.virt.hardware [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 650.290567] env[67893]: DEBUG nova.virt.hardware [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 650.290567] env[67893]: DEBUG nova.virt.hardware [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 650.290928] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-054c407c-e479-460b-88a8-952545ce7f0e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.304817] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cebcabdb-02ab-459c-a559-95f66b662902 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.313698] env[67893]: DEBUG nova.policy [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6066e0e9965141888f6db0cdeb52a2cc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ceeb713fc8334616a1e3d122ad1a6138', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 650.496029] env[67893]: DEBUG oslo_concurrency.lockutils [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Acquiring lock "2256af1c-4ff8-46b9-b568-c25ce8886e5f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 650.500429] env[67893]: DEBUG oslo_concurrency.lockutils [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Lock "2256af1c-4ff8-46b9-b568-c25ce8886e5f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.004s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 650.528339] env[67893]: DEBUG nova.compute.manager [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 
2256af1c-4ff8-46b9-b568-c25ce8886e5f] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 650.609625] env[67893]: DEBUG nova.network.neutron [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Updating instance_info_cache with network_info: [{"id": "d715a161-0823-4255-9b75-f8ec650c03bf", "address": "fa:16:3e:db:63:ba", "network": {"id": "313a8b20-43a7-4371-9c86-5bbae9867952", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1884844710-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4aa620f2e894427a862b1a595c9e019f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed3ffc1d-9f86-4029-857e-6cd1d383edbb", "external-id": "nsx-vlan-transportzone-759", "segmentation_id": 759, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd715a161-08", "ovs_interfaceid": "d715a161-0823-4255-9b75-f8ec650c03bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 650.623025] env[67893]: DEBUG oslo_concurrency.lockutils [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 650.623025] env[67893]: DEBUG oslo_concurrency.lockutils [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 650.623123] env[67893]: INFO nova.compute.claims [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 650.629833] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Releasing lock "refresh_cache-2f69fae8-d060-4156-8880-071f5ee1f969" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 650.630118] env[67893]: DEBUG nova.compute.manager [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] 
Instance network_info: |[{"id": "d715a161-0823-4255-9b75-f8ec650c03bf", "address": "fa:16:3e:db:63:ba", "network": {"id": "313a8b20-43a7-4371-9c86-5bbae9867952", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1884844710-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4aa620f2e894427a862b1a595c9e019f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed3ffc1d-9f86-4029-857e-6cd1d383edbb", "external-id": "nsx-vlan-transportzone-759", "segmentation_id": 759, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd715a161-08", "ovs_interfaceid": "d715a161-0823-4255-9b75-f8ec650c03bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 650.630486] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:db:63:ba', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ed3ffc1d-9f86-4029-857e-6cd1d383edbb', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd715a161-0823-4255-9b75-f8ec650c03bf', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 650.638063] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Creating folder: Project (4aa620f2e894427a862b1a595c9e019f). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 650.644257] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3d3bc355-32fe-42c3-82db-8ab4f5275f86 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.654348] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Created folder: Project (4aa620f2e894427a862b1a595c9e019f) in parent group-v689771. [ 650.654468] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Creating folder: Instances. Parent ref: group-v689796. 
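
Just before these folders are created, vmops translated the Neutron network_info into the driver-side "Instance VIF info" seen above: the integration bridge becomes the network name, and the NSX logical switch id is carried over as an OpaqueNetwork reference so vCenter can bind the NIC to the right logical switch. A hand-written sketch of that mapping (vif_info_from_neutron is illustrative, not the vmops code; the vmxnet3 vif_model matches the image used throughout this run):

    # Map one Neutron VIF dict to the vmwareapi driver's VIF info shape.
    def vif_info_from_neutron(vif):
        details = vif["details"]
        return {
            "network_name": vif["network"]["bridge"],  # e.g. 'br-int'
            "mac_address": vif["address"],
            "network_ref": {
                "type": "OpaqueNetwork",
                "network-id": details["nsx-logical-switch-id"],
                "network-type": "nsx.LogicalSwitch",
                "use-external-id": True,
            },
            "iface_id": vif["id"],
            "vif_model": "vmxnet3",  # NIC model used in this deployment
        }

    neutron_vif = {
        "id": "d715a161-0823-4255-9b75-f8ec650c03bf",
        "address": "fa:16:3e:db:63:ba",
        "network": {"bridge": "br-int"},
        "details": {"nsx-logical-switch-id": "ed3ffc1d-9f86-4029-857e-6cd1d383edbb"},
    }
    print(vif_info_from_neutron(neutron_vif))
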
{{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 650.654709] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b5e23889-691f-4fd1-82c5-3f50d2b1ed99 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.663928] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Created folder: Instances in parent group-v689796. [ 650.664240] env[67893]: DEBUG oslo.service.loopingcall [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 650.664350] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 650.664557] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-4ad14910-042f-4a23-98f3-ca770c52be8e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.690225] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 650.690225] env[67893]: value = "task-3455316" [ 650.690225] env[67893]: _type = "Task" [ 650.690225] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 650.700240] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455316, 'name': CreateVM_Task} progress is 0%. 
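
The acquired/waited and released/held lines woven through this trace come from oslo.concurrency's lock helpers, which time how long a caller queued for a named lock and how long it held it (for example the 0.933s and 0.458s "compute_resources" claims nearby). A simplified stand-in for that bookkeeping (the lock() context manager below only mimics the log format; it is not the lockutils implementation):

    # Timed named-lock pattern behind the "waited"/"held" log lines.
    import contextlib
    import threading
    import time

    _locks = {}

    @contextlib.contextmanager
    def lock(name, caller):
        lk = _locks.setdefault(name, threading.Lock())
        t0 = time.monotonic()
        lk.acquire()
        waited = time.monotonic() - t0
        print('Lock "%s" acquired by "%s" :: waited %.3fs' % (name, caller, waited))
        t1 = time.monotonic()
        try:
            yield
        finally:
            lk.release()
            held = time.monotonic() - t1
            print('Lock "%s" "released" by "%s" :: held %.3fs' % (name, caller, held))

    with lock("compute_resources", "ResourceTracker.instance_claim"):
        time.sleep(0.01)  # the claim work happens while the lock is held
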
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 650.976495] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4070e138-f962-42f3-a97f-f01a8dcd4817 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.988692] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb898afb-b9d5-4f36-a6c0-0ef631cae712 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.022473] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3f734c9-fae1-4a28-9a13-f01387baef72 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.030524] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2f7b8a4-1082-4e1b-a547-e4bf7d40f1d9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.046977] env[67893]: DEBUG nova.compute.provider_tree [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 651.058917] env[67893]: DEBUG nova.scheduler.client.report [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 651.080201] env[67893]: DEBUG oslo_concurrency.lockutils [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.458s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 651.080719] env[67893]: DEBUG nova.compute.manager [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Start building networks asynchronously for instance. 
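
"Start building networks asynchronously" marks the fork visible for every instance in this trace: port allocation runs in the background while block device mappings are prepared, and the spawn path later blocks on the allocation result (which is why the "Successfully created port" and "Successfully updated port" records arrive interleaved with other build steps). Nova does this with eventlet greenthreads; the runnable sketch below uses concurrent.futures purely to show the shape of the pattern, with stand-in functions for the Neutron and BDM work:

    # Background network allocation overlapped with BDM preparation.
    from concurrent.futures import ThreadPoolExecutor
    import time

    def allocate_for_instance(instance_id):
        # Stand-in for the Neutron port allocation; returns network_info.
        time.sleep(0.1)  # simulate the Neutron round trips
        return [{"id": "port-uuid", "address": "fa:16:3e:00:00:01"}]

    def build_block_device_mappings(instance_id):
        # Stand-in for the BDM preparation that runs while ports are created.
        return [{"device_name": "/dev/sda", "boot_index": 0}]

    def build_and_spawn(instance_id):
        with ThreadPoolExecutor(max_workers=1) as pool:
            # "Start building networks asynchronously for instance."
            nw_future = pool.submit(allocate_for_instance, instance_id)
            # "Start building block device mappings for instance."
            bdms = build_block_device_mappings(instance_id)
            # "Start spawning the instance on the hypervisor." -- the spawn
            # path blocks here until the port allocation has finished.
            network_info = nw_future.result()
            return network_info, bdms

    print(build_and_spawn("2256af1c-4ff8-46b9-b568-c25ce8886e5f"))
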
{{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 651.130651] env[67893]: DEBUG nova.compute.utils [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 651.132748] env[67893]: DEBUG nova.compute.manager [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 651.135238] env[67893]: DEBUG nova.network.neutron [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 651.151140] env[67893]: DEBUG nova.compute.manager [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 651.203732] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455316, 'name': CreateVM_Task, 'duration_secs': 0.367857} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 651.203856] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 651.208881] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 651.208881] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 651.209073] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 651.209593] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-990ea1d9-3f31-4bb7-b0f6-43159f9f654e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.214478] env[67893]: DEBUG oslo_vmware.api [None 
req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Waiting for the task: (returnval){ [ 651.214478] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]5249df9b-12bc-1d17-bd1b-1ab7677b5b6e" [ 651.214478] env[67893]: _type = "Task" [ 651.214478] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 651.223063] env[67893]: DEBUG oslo_vmware.api [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]5249df9b-12bc-1d17-bd1b-1ab7677b5b6e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 651.244373] env[67893]: DEBUG nova.compute.manager [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Start spawning the instance on the hypervisor. {{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 651.283125] env[67893]: DEBUG nova.virt.hardware [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 651.283125] env[67893]: DEBUG nova.virt.hardware [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 651.283745] env[67893]: DEBUG nova.virt.hardware [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 651.284115] env[67893]: DEBUG nova.virt.hardware [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 651.284719] env[67893]: DEBUG nova.virt.hardware [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Image pref 0:0:0 {{(pid=67893) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 651.284719] env[67893]: DEBUG nova.virt.hardware [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 651.284719] env[67893]: DEBUG nova.virt.hardware [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 651.284936] env[67893]: DEBUG nova.virt.hardware [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 651.284936] env[67893]: DEBUG nova.virt.hardware [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 651.288159] env[67893]: DEBUG nova.virt.hardware [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 651.288159] env[67893]: DEBUG nova.virt.hardware [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 651.288159] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4dcdcba3-6e2c-4c83-a4c3-77cc5557d1d0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.294863] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-043f00ee-dc2d-4a68-bea6-59bdab6faf10 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.361632] env[67893]: DEBUG oslo_concurrency.lockutils [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Acquiring lock "6520080a-8bf1-4803-9099-87c3ba6e28e4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 651.361863] env[67893]: DEBUG oslo_concurrency.lockutils [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Lock "6520080a-8bf1-4803-9099-87c3ba6e28e4" acquired by
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 651.380133] env[67893]: DEBUG nova.network.neutron [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Successfully created port: d6327d09-d00b-409d-ba09-b06bbfe8d034 {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 651.501298] env[67893]: DEBUG nova.policy [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3ab750b56e594db684a09d223661e58c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '95903c6afc524348bee927fd80b17219', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 651.725117] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 651.726692] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 651.726692] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 652.075127] env[67893]: DEBUG nova.compute.manager [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Received event network-vif-plugged-c9265da9-f56e-4745-94b4-71b1e8fe10ec {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 652.075127] env[67893]: DEBUG oslo_concurrency.lockutils [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] Acquiring lock "043c631c-bf15-4b4c-9a92-49ea51b6d405-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 652.075127] env[67893]: DEBUG oslo_concurrency.lockutils [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] Lock 
"043c631c-bf15-4b4c-9a92-49ea51b6d405-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 652.075127] env[67893]: DEBUG oslo_concurrency.lockutils [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] Lock "043c631c-bf15-4b4c-9a92-49ea51b6d405-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 652.075592] env[67893]: DEBUG nova.compute.manager [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] No waiting events found dispatching network-vif-plugged-c9265da9-f56e-4745-94b4-71b1e8fe10ec {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 652.075592] env[67893]: WARNING nova.compute.manager [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Received unexpected event network-vif-plugged-c9265da9-f56e-4745-94b4-71b1e8fe10ec for instance with vm_state building and task_state spawning. [ 652.075592] env[67893]: DEBUG nova.compute.manager [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Received event network-changed-c9265da9-f56e-4745-94b4-71b1e8fe10ec {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 652.075733] env[67893]: DEBUG nova.compute.manager [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Refreshing instance network info cache due to event network-changed-c9265da9-f56e-4745-94b4-71b1e8fe10ec. 
{{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 652.076107] env[67893]: DEBUG oslo_concurrency.lockutils [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] Acquiring lock "refresh_cache-043c631c-bf15-4b4c-9a92-49ea51b6d405" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 652.076373] env[67893]: DEBUG oslo_concurrency.lockutils [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] Acquired lock "refresh_cache-043c631c-bf15-4b4c-9a92-49ea51b6d405" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 652.076714] env[67893]: DEBUG nova.network.neutron [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Refreshing network info cache for port c9265da9-f56e-4745-94b4-71b1e8fe10ec {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 652.911932] env[67893]: DEBUG nova.network.neutron [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Successfully updated port: d6327d09-d00b-409d-ba09-b06bbfe8d034 {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 652.927656] env[67893]: DEBUG oslo_concurrency.lockutils [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Acquiring lock "refresh_cache-b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 652.927867] env[67893]: DEBUG oslo_concurrency.lockutils [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Acquired lock "refresh_cache-b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 652.928100] env[67893]: DEBUG nova.network.neutron [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 652.987561] env[67893]: DEBUG nova.network.neutron [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Successfully created port: cb1b9bfc-7a09-4c2f-9281-8fd57702ce4c {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 653.050882] env[67893]: DEBUG nova.network.neutron [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Updated VIF entry in instance network info cache for port c9265da9-f56e-4745-94b4-71b1e8fe10ec. 
{{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 653.051324] env[67893]: DEBUG nova.network.neutron [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Updating instance_info_cache with network_info: [{"id": "c9265da9-f56e-4745-94b4-71b1e8fe10ec", "address": "fa:16:3e:9a:ee:dd", "network": {"id": "0158027f-656f-43e3-aebf-7cbb75cfd948", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "304a5519bb7c46efb34a42749d9cf409", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a8b99a46-3e7f-4ef1-9e45-58e6cd17f210", "external-id": "nsx-vlan-transportzone-704", "segmentation_id": 704, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc9265da9-f5", "ovs_interfaceid": "c9265da9-f56e-4745-94b4-71b1e8fe10ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 653.058532] env[67893]: DEBUG nova.network.neutron [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Instance cache missing network info. 
{{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 653.062981] env[67893]: DEBUG oslo_concurrency.lockutils [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] Releasing lock "refresh_cache-043c631c-bf15-4b4c-9a92-49ea51b6d405" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 653.063823] env[67893]: DEBUG nova.compute.manager [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Received event network-vif-plugged-7e5ed9ec-a2a6-4c0a-9486-5c4280904e37 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 653.063823] env[67893]: DEBUG oslo_concurrency.lockutils [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] Acquiring lock "96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 653.063823] env[67893]: DEBUG oslo_concurrency.lockutils [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] Lock "96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 653.063823] env[67893]: DEBUG oslo_concurrency.lockutils [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] Lock "96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 653.064067] env[67893]: DEBUG nova.compute.manager [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] No waiting events found dispatching network-vif-plugged-7e5ed9ec-a2a6-4c0a-9486-5c4280904e37 {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 653.064067] env[67893]: WARNING nova.compute.manager [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Received unexpected event network-vif-plugged-7e5ed9ec-a2a6-4c0a-9486-5c4280904e37 for instance with vm_state building and task_state spawning. [ 653.064161] env[67893]: DEBUG nova.compute.manager [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Received event network-changed-7e5ed9ec-a2a6-4c0a-9486-5c4280904e37 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 653.064559] env[67893]: DEBUG nova.compute.manager [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Refreshing instance network info cache due to event network-changed-7e5ed9ec-a2a6-4c0a-9486-5c4280904e37.
{{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 653.064559] env[67893]: DEBUG oslo_concurrency.lockutils [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] Acquiring lock "refresh_cache-96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 653.064782] env[67893]: DEBUG oslo_concurrency.lockutils [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] Acquired lock "refresh_cache-96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 653.064782] env[67893]: DEBUG nova.network.neutron [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Refreshing network info cache for port 7e5ed9ec-a2a6-4c0a-9486-5c4280904e37 {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 653.162032] env[67893]: DEBUG nova.compute.manager [req-761a765b-bb79-430b-9bd2-4381a80a825f req-a11aaa4f-6807-4f32-9fa4-cf0e2a7823fa service nova] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Received event network-vif-plugged-d715a161-0823-4255-9b75-f8ec650c03bf {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 653.162032] env[67893]: DEBUG oslo_concurrency.lockutils [req-761a765b-bb79-430b-9bd2-4381a80a825f req-a11aaa4f-6807-4f32-9fa4-cf0e2a7823fa service nova] Acquiring lock "2f69fae8-d060-4156-8880-071f5ee1f969-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 653.162032] env[67893]: DEBUG oslo_concurrency.lockutils [req-761a765b-bb79-430b-9bd2-4381a80a825f req-a11aaa4f-6807-4f32-9fa4-cf0e2a7823fa service nova] Lock "2f69fae8-d060-4156-8880-071f5ee1f969-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 653.162400] env[67893]: DEBUG oslo_concurrency.lockutils [req-761a765b-bb79-430b-9bd2-4381a80a825f req-a11aaa4f-6807-4f32-9fa4-cf0e2a7823fa service nova] Lock "2f69fae8-d060-4156-8880-071f5ee1f969-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 653.162400] env[67893]: DEBUG nova.compute.manager [req-761a765b-bb79-430b-9bd2-4381a80a825f req-a11aaa4f-6807-4f32-9fa4-cf0e2a7823fa service nova] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] No waiting events found dispatching network-vif-plugged-d715a161-0823-4255-9b75-f8ec650c03bf {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 653.163136] env[67893]: WARNING nova.compute.manager [req-761a765b-bb79-430b-9bd2-4381a80a825f req-a11aaa4f-6807-4f32-9fa4-cf0e2a7823fa service nova] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Received unexpected event network-vif-plugged-d715a161-0823-4255-9b75-f8ec650c03bf for instance with vm_state building and task_state spawning.
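
The per-instance "-events" lock messages above repeat for every spawn in this log: an external event pushed by Neutron (network-vif-plugged, network-changed) is dispatched under that lock and either hands off to a thread that registered a waiter for it, or is logged as "Received unexpected event" when the spawn has not registered one yet. A minimal sketch of that dispatch pattern in plain Python, with threading.Event standing in for Nova's eventlet-based machinery (class and function names here are illustrative, not Nova's actual API):

import threading

class InstanceEvents:
    """Illustrative per-instance event registry (not Nova's real class)."""

    def __init__(self):
        # Plays the role of the per-instance "<uuid>-events" lock in the log.
        self._lock = threading.Lock()
        self._waiters = {}  # (instance_uuid, event_name) -> threading.Event

    def prepare_for_event(self, instance_uuid, event_name):
        # Called by the spawning thread *before* it blocks waiting for Neutron.
        ev = threading.Event()
        with self._lock:
            self._waiters[(instance_uuid, event_name)] = ev
        return ev  # the spawning thread later calls ev.wait()

    def pop_instance_event(self, instance_uuid, event_name):
        # Corresponds to the "acquired by ..._pop_event" / "released by
        # ..._pop_event" lock pairs in the log entries above.
        with self._lock:
            return self._waiters.pop((instance_uuid, event_name), None)

def external_instance_event(registry, instance_uuid, event_name):
    # Entry point for events pushed by the network service.
    ev = registry.pop_instance_event(instance_uuid, event_name)
    if ev is None:
        # Matches the WARNING lines above: nothing was waiting for this
        # event yet (the instance is still building/spawning).
        print(f"Received unexpected event {event_name} for instance {instance_uuid}")
    else:
        ev.set()  # wake the thread blocked in ev.wait()

In the entries above the spawn had not yet registered a waiter, which is why each network-vif-plugged event falls into the "unexpected" branch rather than waking anyone.
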
[ 653.898997] env[67893]: DEBUG nova.network.neutron [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Updating instance_info_cache with network_info: [{"id": "d6327d09-d00b-409d-ba09-b06bbfe8d034", "address": "fa:16:3e:61:b8:a7", "network": {"id": "45d43c16-3b98-4fff-a20a-12d1f40f8f39", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082006170-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ceeb713fc8334616a1e3d122ad1a6138", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1712475b-e1be-49e0-9a18-febd305c90ad", "external-id": "nsx-vlan-transportzone-531", "segmentation_id": 531, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd6327d09-d0", "ovs_interfaceid": "d6327d09-d00b-409d-ba09-b06bbfe8d034", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 653.916503] env[67893]: DEBUG oslo_concurrency.lockutils [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Releasing lock "refresh_cache-b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 653.916503] env[67893]: DEBUG nova.compute.manager [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Instance network_info: |[{"id": "d6327d09-d00b-409d-ba09-b06bbfe8d034", "address": "fa:16:3e:61:b8:a7", "network": {"id": "45d43c16-3b98-4fff-a20a-12d1f40f8f39", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082006170-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ceeb713fc8334616a1e3d122ad1a6138", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1712475b-e1be-49e0-9a18-febd305c90ad", "external-id": "nsx-vlan-transportzone-531", "segmentation_id": 531, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd6327d09-d0", "ovs_interfaceid": "d6327d09-d00b-409d-ba09-b06bbfe8d034", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 653.916726] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 
tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:61:b8:a7', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1712475b-e1be-49e0-9a18-febd305c90ad', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd6327d09-d00b-409d-ba09-b06bbfe8d034', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 653.926179] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Creating folder: Project (ceeb713fc8334616a1e3d122ad1a6138). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 653.926179] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-87e78938-8e70-4c23-a3aa-3e04305c2245 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 653.934760] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Created folder: Project (ceeb713fc8334616a1e3d122ad1a6138) in parent group-v689771. [ 653.934973] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Creating folder: Instances. Parent ref: group-v689799. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 653.938433] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7de88db5-2d32-4a48-beb9-e0e55f9bba52 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 653.947278] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Created folder: Instances in parent group-v689799. [ 653.948813] env[67893]: DEBUG oslo.service.loopingcall [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 653.948903] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 653.949159] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-274f52e9-013f-4eff-9bff-2b6ba351c779 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 653.972723] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 653.972723] env[67893]: value = "task-3455321" [ 653.972723] env[67893]: _type = "Task" [ 653.972723] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 653.984914] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455321, 'name': CreateVM_Task} progress is 5%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 654.096543] env[67893]: DEBUG nova.network.neutron [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Updated VIF entry in instance network info cache for port 7e5ed9ec-a2a6-4c0a-9486-5c4280904e37. {{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 654.097504] env[67893]: DEBUG nova.network.neutron [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Updating instance_info_cache with network_info: [{"id": "7e5ed9ec-a2a6-4c0a-9486-5c4280904e37", "address": "fa:16:3e:c3:97:b2", "network": {"id": "f28d4f17-ac87-4b46-aae5-8b77669924a8", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1354805933-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "003516068deb46448b67dcdb7d11c345", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c4d3f69a-b086-4c3b-b976-5a848b63dfc4", "external-id": "nsx-vlan-transportzone-627", "segmentation_id": 627, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7e5ed9ec-a2", "ovs_interfaceid": "7e5ed9ec-a2a6-4c0a-9486-5c4280904e37", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 654.113763] env[67893]: DEBUG oslo_concurrency.lockutils [req-1a270bf0-41a3-4a0e-a69b-29df28c4b49f req-7aace64e-25f2-4bc7-b633-39ff8e7ef46b service nova] Releasing lock "refresh_cache-96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 654.484777] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455321, 'name': CreateVM_Task, 'duration_secs': 0.355841} completed successfully. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 654.485096] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 654.486273] env[67893]: DEBUG oslo_vmware.service [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbd19f52-9d2a-4a5e-85fd-04f6142b5fcb {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 654.494855] env[67893]: DEBUG oslo_concurrency.lockutils [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 654.495051] env[67893]: DEBUG oslo_concurrency.lockutils [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 654.495420] env[67893]: DEBUG oslo_concurrency.lockutils [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 654.495668] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-63d3a2c1-73c6-4afa-b192-236b9f41eb3e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 654.502732] env[67893]: DEBUG oslo_vmware.api [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Waiting for the task: (returnval){ [ 654.502732] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]5295f162-a044-bdf0-3eb3-5db16faf4c0e" [ 654.502732] env[67893]: _type = "Task" [ 654.502732] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 654.511660] env[67893]: DEBUG oslo_vmware.api [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]5295f162-a044-bdf0-3eb3-5db16faf4c0e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 654.881088] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Acquiring lock "19ab9782-9131-46ba-bbf2-cc021953046e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 654.881757] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Lock "19ab9782-9131-46ba-bbf2-cc021953046e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 654.892390] env[67893]: DEBUG nova.network.neutron [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Successfully updated port: cb1b9bfc-7a09-4c2f-9281-8fd57702ce4c {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 654.912074] env[67893]: DEBUG oslo_concurrency.lockutils [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Acquiring lock "refresh_cache-2256af1c-4ff8-46b9-b568-c25ce8886e5f" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 654.912074] env[67893]: DEBUG oslo_concurrency.lockutils [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Acquired lock "refresh_cache-2256af1c-4ff8-46b9-b568-c25ce8886e5f" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 654.912074] env[67893]: DEBUG nova.network.neutron [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 655.003135] env[67893]: DEBUG nova.network.neutron [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Instance cache missing network info.
{{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 655.024369] env[67893]: DEBUG oslo_concurrency.lockutils [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 655.024760] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 655.025308] env[67893]: DEBUG oslo_concurrency.lockutils [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 655.025308] env[67893]: DEBUG oslo_concurrency.lockutils [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 655.027474] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 655.027802] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7fb73c85-0448-41b0-a70c-a6ad21014e79 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.040018] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 655.040018] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 655.040018] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21a54015-d33c-43b1-96d6-c5aeb953cc09 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.049196] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-006fa4d1-20dc-4ef9-9ce0-bd90d2b23e23 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.058485] env[67893]: DEBUG oslo_vmware.api [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Waiting for the task: (returnval){ [ 655.058485] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]5279c3df-0842-4f45-17da-70d357afdc5a" [ 655.058485] env[67893]: _type = "Task" [ 655.058485] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 655.066423] env[67893]: DEBUG oslo_vmware.api [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]5279c3df-0842-4f45-17da-70d357afdc5a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 655.404777] env[67893]: DEBUG nova.network.neutron [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Updating instance_info_cache with network_info: [{"id": "cb1b9bfc-7a09-4c2f-9281-8fd57702ce4c", "address": "fa:16:3e:d5:d5:b0", "network": {"id": "0158027f-656f-43e3-aebf-7cbb75cfd948", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "304a5519bb7c46efb34a42749d9cf409", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a8b99a46-3e7f-4ef1-9e45-58e6cd17f210", "external-id": "nsx-vlan-transportzone-704", "segmentation_id": 704, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcb1b9bfc-7a", "ovs_interfaceid": "cb1b9bfc-7a09-4c2f-9281-8fd57702ce4c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 655.416942] env[67893]: DEBUG oslo_concurrency.lockutils [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Releasing lock "refresh_cache-2256af1c-4ff8-46b9-b568-c25ce8886e5f" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 655.417268] env[67893]: DEBUG nova.compute.manager [None 
req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Instance network_info: |[{"id": "cb1b9bfc-7a09-4c2f-9281-8fd57702ce4c", "address": "fa:16:3e:d5:d5:b0", "network": {"id": "0158027f-656f-43e3-aebf-7cbb75cfd948", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "304a5519bb7c46efb34a42749d9cf409", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a8b99a46-3e7f-4ef1-9e45-58e6cd17f210", "external-id": "nsx-vlan-transportzone-704", "segmentation_id": 704, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcb1b9bfc-7a", "ovs_interfaceid": "cb1b9bfc-7a09-4c2f-9281-8fd57702ce4c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 655.417711] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d5:d5:b0', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a8b99a46-3e7f-4ef1-9e45-58e6cd17f210', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'cb1b9bfc-7a09-4c2f-9281-8fd57702ce4c', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 655.429893] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Creating folder: Project (95903c6afc524348bee927fd80b17219). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 655.431206] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-10270c88-4655-4da3-92e5-57c304f449e8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.443331] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Created folder: Project (95903c6afc524348bee927fd80b17219) in parent group-v689771. [ 655.443331] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Creating folder: Instances. Parent ref: group-v689803. 
{{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 655.444606] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8d9e8c3e-dde3-4d6a-9911-cf462ba79945 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.452575] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Created folder: Instances in parent group-v689803. [ 655.452575] env[67893]: DEBUG oslo.service.loopingcall [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 655.452660] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 655.452937] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-30444cb1-c10c-48f4-b7ac-94d78722f285 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.480506] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 655.480506] env[67893]: value = "task-3455324" [ 655.480506] env[67893]: _type = "Task" [ 655.480506] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 655.490822] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455324, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 655.568864] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 655.569229] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Creating directory with path [datastore2] vmware_temp/300ba298-96e9-46a3-a9ac-bc8cbe54b1b3/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 655.570166] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-60fd65c4-2833-4ed1-a725-8cffc3082a97 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.582574] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Created directory with path [datastore2] vmware_temp/300ba298-96e9-46a3-a9ac-bc8cbe54b1b3/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 655.583617] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Fetch image to [datastore2] vmware_temp/300ba298-96e9-46a3-a9ac-bc8cbe54b1b3/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 655.583969] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore2] vmware_temp/300ba298-96e9-46a3-a9ac-bc8cbe54b1b3/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore2 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 655.585194] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58ffc90c-0c1b-4310-8fe1-a1f954b5e947 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.594021] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43e71222-c6dd-4434-9493-7cb235f6d527 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.607274] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1780d310-3903-4d9d-a12f-a035d288cddf {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.641091] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c23e4a80-dcd8-442a-888b-0183439f22f1 {{(pid=67893) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.648176] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-60b5402f-47d0-44c3-a7c7-ee8f8d76e458 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.668303] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore2 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 655.743268] env[67893]: DEBUG oslo_vmware.rw_handles [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/300ba298-96e9-46a3-a9ac-bc8cbe54b1b3/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 655.849975] env[67893]: DEBUG oslo_vmware.rw_handles [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 655.850531] env[67893]: DEBUG oslo_vmware.rw_handles [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/300ba298-96e9-46a3-a9ac-bc8cbe54b1b3/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 655.995218] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455324, 'name': CreateVM_Task, 'duration_secs': 0.366145} completed successfully. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 655.995218] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 655.995830] env[67893]: DEBUG oslo_concurrency.lockutils [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 655.996020] env[67893]: DEBUG oslo_concurrency.lockutils [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 655.996371] env[67893]: DEBUG oslo_concurrency.lockutils [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 655.996710] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-271d917b-2596-43f8-a6ab-e5a26434f115 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 656.003258] env[67893]: DEBUG oslo_vmware.api [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Waiting for the task: (returnval){ [ 656.003258] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52c8ce75-8c31-fcd5-5dde-7d9a1e3c2989" [ 656.003258] env[67893]: _type = "Task" [ 656.003258] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 656.015717] env[67893]: DEBUG oslo_vmware.api [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52c8ce75-8c31-fcd5-5dde-7d9a1e3c2989, 'name': SearchDatastore_Task} progress is 0%. 
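The lock records in this stretch come from oslo.concurrency: the `lock ... lockutils.py:312/315/333` lines are the context-manager form, while the `inner ... lockutils.py:404/409/423` lines are the `synchronized` decorator. A sketch of both, with the lock name copied from the records and purely illustrative bodies:

```python
from oslo_concurrency import lockutils

# Context-manager form: logs "Acquiring"/"Acquired" on entry and
# "Releasing" on exit, with the waited/held durations shown above.
with lockutils.lock('[datastore1] devstack-image-cache_base/'
                    'c08ce966-3dfb-4888-bf12-9a3e73350a20'):
    pass  # fetch-or-reuse the cached image while holding the lock

# Decorator form, as used by the build_and_run_instance locks later on.
@lockutils.synchronized('c08ce966-3dfb-4888-bf12-9a3e73350a20')
def fetch_image_once():
    pass
```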
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 656.038926] env[67893]: DEBUG nova.compute.manager [req-876807e4-773f-44e9-95a2-d2d8216dde80 req-a5087bbc-15cf-48ea-af9e-f95a6aaac433 service nova] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Received event network-vif-plugged-d6327d09-d00b-409d-ba09-b06bbfe8d034 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 656.039218] env[67893]: DEBUG oslo_concurrency.lockutils [req-876807e4-773f-44e9-95a2-d2d8216dde80 req-a5087bbc-15cf-48ea-af9e-f95a6aaac433 service nova] Acquiring lock "b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 656.039428] env[67893]: DEBUG oslo_concurrency.lockutils [req-876807e4-773f-44e9-95a2-d2d8216dde80 req-a5087bbc-15cf-48ea-af9e-f95a6aaac433 service nova] Lock "b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 656.039611] env[67893]: DEBUG oslo_concurrency.lockutils [req-876807e4-773f-44e9-95a2-d2d8216dde80 req-a5087bbc-15cf-48ea-af9e-f95a6aaac433 service nova] Lock "b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 656.039808] env[67893]: DEBUG nova.compute.manager [req-876807e4-773f-44e9-95a2-d2d8216dde80 req-a5087bbc-15cf-48ea-af9e-f95a6aaac433 service nova] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] No waiting events found dispatching network-vif-plugged-d6327d09-d00b-409d-ba09-b06bbfe8d034 {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 656.040033] env[67893]: WARNING nova.compute.manager [req-876807e4-773f-44e9-95a2-d2d8216dde80 req-a5087bbc-15cf-48ea-af9e-f95a6aaac433 service nova] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Received unexpected event network-vif-plugged-d6327d09-d00b-409d-ba09-b06bbfe8d034 for instance with vm_state building and task_state spawning. [ 656.040211] env[67893]: DEBUG nova.compute.manager [req-876807e4-773f-44e9-95a2-d2d8216dde80 req-a5087bbc-15cf-48ea-af9e-f95a6aaac433 service nova] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Received event network-changed-d6327d09-d00b-409d-ba09-b06bbfe8d034 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 656.040374] env[67893]: DEBUG nova.compute.manager [req-876807e4-773f-44e9-95a2-d2d8216dde80 req-a5087bbc-15cf-48ea-af9e-f95a6aaac433 service nova] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Refreshing instance network info cache due to event network-changed-d6327d09-d00b-409d-ba09-b06bbfe8d034. 
{{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 656.040561] env[67893]: DEBUG oslo_concurrency.lockutils [req-876807e4-773f-44e9-95a2-d2d8216dde80 req-a5087bbc-15cf-48ea-af9e-f95a6aaac433 service nova] Acquiring lock "refresh_cache-b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 656.040700] env[67893]: DEBUG oslo_concurrency.lockutils [req-876807e4-773f-44e9-95a2-d2d8216dde80 req-a5087bbc-15cf-48ea-af9e-f95a6aaac433 service nova] Acquired lock "refresh_cache-b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 656.041080] env[67893]: DEBUG nova.network.neutron [req-876807e4-773f-44e9-95a2-d2d8216dde80 req-a5087bbc-15cf-48ea-af9e-f95a6aaac433 service nova] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Refreshing network info cache for port d6327d09-d00b-409d-ba09-b06bbfe8d034 {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 656.514368] env[67893]: DEBUG oslo_concurrency.lockutils [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 656.514903] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 656.514903] env[67893]: DEBUG oslo_concurrency.lockutils [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 657.121449] env[67893]: DEBUG nova.network.neutron [req-876807e4-773f-44e9-95a2-d2d8216dde80 req-a5087bbc-15cf-48ea-af9e-f95a6aaac433 service nova] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Updated VIF entry in instance network info cache for port d6327d09-d00b-409d-ba09-b06bbfe8d034. 
{{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 657.121708] env[67893]: DEBUG nova.network.neutron [req-876807e4-773f-44e9-95a2-d2d8216dde80 req-a5087bbc-15cf-48ea-af9e-f95a6aaac433 service nova] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Updating instance_info_cache with network_info: [{"id": "d6327d09-d00b-409d-ba09-b06bbfe8d034", "address": "fa:16:3e:61:b8:a7", "network": {"id": "45d43c16-3b98-4fff-a20a-12d1f40f8f39", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082006170-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ceeb713fc8334616a1e3d122ad1a6138", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1712475b-e1be-49e0-9a18-febd305c90ad", "external-id": "nsx-vlan-transportzone-531", "segmentation_id": 531, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd6327d09-d0", "ovs_interfaceid": "d6327d09-d00b-409d-ba09-b06bbfe8d034", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 657.143446] env[67893]: DEBUG oslo_concurrency.lockutils [req-876807e4-773f-44e9-95a2-d2d8216dde80 req-a5087bbc-15cf-48ea-af9e-f95a6aaac433 service nova] Releasing lock "refresh_cache-b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 657.237175] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Acquiring lock "2eb8d698-9436-4e91-bd10-5f5200415144" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 657.237366] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Lock "2eb8d698-9436-4e91-bd10-5f5200415144" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 657.335953] env[67893]: DEBUG nova.compute.manager [req-57c16e4f-baba-45a7-a3d0-1c6bd26f913f req-22092ea8-d329-4d24-b16e-ae4f658b9a4b service nova] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Received event network-changed-d715a161-0823-4255-9b75-f8ec650c03bf {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 657.336339] env[67893]: DEBUG nova.compute.manager [req-57c16e4f-baba-45a7-a3d0-1c6bd26f913f req-22092ea8-d329-4d24-b16e-ae4f658b9a4b service nova] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Refreshing instance network info cache due to event network-changed-d715a161-0823-4255-9b75-f8ec650c03bf. 
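The `instance_info_cache` dump above is JSON-compatible, so a single VIF entry can be picked apart directly. A small illustrative helper (not Nova code) over fields trimmed from the d6327d09 record:

```python
import json

vif = json.loads('''{
  "id": "d6327d09-d00b-409d-ba09-b06bbfe8d034",
  "address": "fa:16:3e:61:b8:a7",
  "devname": "tapd6327d09-d0",
  "network": {"id": "45d43c16-3b98-4fff-a20a-12d1f40f8f39",
              "bridge": "br-int",
              "subnets": [{"cidr": "192.168.128.0/28",
                           "ips": [{"address": "192.168.128.6",
                                    "type": "fixed"}]}]},
  "details": {"segmentation_id": 531}
}''')

# Collect the fixed IPs across all subnets on this VIF.
fixed_ips = [ip["address"]
             for subnet in vif["network"]["subnets"]
             for ip in subnet["ips"] if ip["type"] == "fixed"]
print(vif["devname"], fixed_ips, vif["details"]["segmentation_id"])
# -> tapd6327d09-d0 ['192.168.128.6'] 531
```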
{{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 657.336419] env[67893]: DEBUG oslo_concurrency.lockutils [req-57c16e4f-baba-45a7-a3d0-1c6bd26f913f req-22092ea8-d329-4d24-b16e-ae4f658b9a4b service nova] Acquiring lock "refresh_cache-2f69fae8-d060-4156-8880-071f5ee1f969" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 657.336890] env[67893]: DEBUG oslo_concurrency.lockutils [req-57c16e4f-baba-45a7-a3d0-1c6bd26f913f req-22092ea8-d329-4d24-b16e-ae4f658b9a4b service nova] Acquired lock "refresh_cache-2f69fae8-d060-4156-8880-071f5ee1f969" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 657.337211] env[67893]: DEBUG nova.network.neutron [req-57c16e4f-baba-45a7-a3d0-1c6bd26f913f req-22092ea8-d329-4d24-b16e-ae4f658b9a4b service nova] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Refreshing network info cache for port d715a161-0823-4255-9b75-f8ec650c03bf {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 657.454019] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Acquiring lock "5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 657.454019] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Lock "5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 658.039633] env[67893]: DEBUG nova.network.neutron [req-57c16e4f-baba-45a7-a3d0-1c6bd26f913f req-22092ea8-d329-4d24-b16e-ae4f658b9a4b service nova] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Updated VIF entry in instance network info cache for port d715a161-0823-4255-9b75-f8ec650c03bf. 
{{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 658.039977] env[67893]: DEBUG nova.network.neutron [req-57c16e4f-baba-45a7-a3d0-1c6bd26f913f req-22092ea8-d329-4d24-b16e-ae4f658b9a4b service nova] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Updating instance_info_cache with network_info: [{"id": "d715a161-0823-4255-9b75-f8ec650c03bf", "address": "fa:16:3e:db:63:ba", "network": {"id": "313a8b20-43a7-4371-9c86-5bbae9867952", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1884844710-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4aa620f2e894427a862b1a595c9e019f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed3ffc1d-9f86-4029-857e-6cd1d383edbb", "external-id": "nsx-vlan-transportzone-759", "segmentation_id": 759, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd715a161-08", "ovs_interfaceid": "d715a161-0823-4255-9b75-f8ec650c03bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 658.052734] env[67893]: DEBUG oslo_concurrency.lockutils [req-57c16e4f-baba-45a7-a3d0-1c6bd26f913f req-22092ea8-d329-4d24-b16e-ae4f658b9a4b service nova] Releasing lock "refresh_cache-2f69fae8-d060-4156-8880-071f5ee1f969" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 658.053171] env[67893]: DEBUG nova.compute.manager [req-57c16e4f-baba-45a7-a3d0-1c6bd26f913f req-22092ea8-d329-4d24-b16e-ae4f658b9a4b service nova] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Received event network-vif-plugged-cb1b9bfc-7a09-4c2f-9281-8fd57702ce4c {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 658.053500] env[67893]: DEBUG oslo_concurrency.lockutils [req-57c16e4f-baba-45a7-a3d0-1c6bd26f913f req-22092ea8-d329-4d24-b16e-ae4f658b9a4b service nova] Acquiring lock "2256af1c-4ff8-46b9-b568-c25ce8886e5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 658.054144] env[67893]: DEBUG oslo_concurrency.lockutils [req-57c16e4f-baba-45a7-a3d0-1c6bd26f913f req-22092ea8-d329-4d24-b16e-ae4f658b9a4b service nova] Lock "2256af1c-4ff8-46b9-b568-c25ce8886e5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 658.054935] env[67893]: DEBUG oslo_concurrency.lockutils [req-57c16e4f-baba-45a7-a3d0-1c6bd26f913f req-22092ea8-d329-4d24-b16e-ae4f658b9a4b service nova] Lock "2256af1c-4ff8-46b9-b568-c25ce8886e5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 658.054935] env[67893]: DEBUG 
nova.compute.manager [req-57c16e4f-baba-45a7-a3d0-1c6bd26f913f req-22092ea8-d329-4d24-b16e-ae4f658b9a4b service nova] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] No waiting events found dispatching network-vif-plugged-cb1b9bfc-7a09-4c2f-9281-8fd57702ce4c {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 658.054935] env[67893]: WARNING nova.compute.manager [req-57c16e4f-baba-45a7-a3d0-1c6bd26f913f req-22092ea8-d329-4d24-b16e-ae4f658b9a4b service nova] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Received unexpected event network-vif-plugged-cb1b9bfc-7a09-4c2f-9281-8fd57702ce4c for instance with vm_state building and task_state spawning. [ 658.054935] env[67893]: DEBUG nova.compute.manager [req-57c16e4f-baba-45a7-a3d0-1c6bd26f913f req-22092ea8-d329-4d24-b16e-ae4f658b9a4b service nova] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Received event network-changed-cb1b9bfc-7a09-4c2f-9281-8fd57702ce4c {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 658.055261] env[67893]: DEBUG nova.compute.manager [req-57c16e4f-baba-45a7-a3d0-1c6bd26f913f req-22092ea8-d329-4d24-b16e-ae4f658b9a4b service nova] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Refreshing instance network info cache due to event network-changed-cb1b9bfc-7a09-4c2f-9281-8fd57702ce4c. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 658.055295] env[67893]: DEBUG oslo_concurrency.lockutils [req-57c16e4f-baba-45a7-a3d0-1c6bd26f913f req-22092ea8-d329-4d24-b16e-ae4f658b9a4b service nova] Acquiring lock "refresh_cache-2256af1c-4ff8-46b9-b568-c25ce8886e5f" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 658.055412] env[67893]: DEBUG oslo_concurrency.lockutils [req-57c16e4f-baba-45a7-a3d0-1c6bd26f913f req-22092ea8-d329-4d24-b16e-ae4f658b9a4b service nova] Acquired lock "refresh_cache-2256af1c-4ff8-46b9-b568-c25ce8886e5f" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 658.055564] env[67893]: DEBUG nova.network.neutron [req-57c16e4f-baba-45a7-a3d0-1c6bd26f913f req-22092ea8-d329-4d24-b16e-ae4f658b9a4b service nova] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Refreshing network info cache for port cb1b9bfc-7a09-4c2f-9281-8fd57702ce4c {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 658.750836] env[67893]: DEBUG nova.network.neutron [req-57c16e4f-baba-45a7-a3d0-1c6bd26f913f req-22092ea8-d329-4d24-b16e-ae4f658b9a4b service nova] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Updated VIF entry in instance network info cache for port cb1b9bfc-7a09-4c2f-9281-8fd57702ce4c. 
{{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 658.751568] env[67893]: DEBUG nova.network.neutron [req-57c16e4f-baba-45a7-a3d0-1c6bd26f913f req-22092ea8-d329-4d24-b16e-ae4f658b9a4b service nova] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Updating instance_info_cache with network_info: [{"id": "cb1b9bfc-7a09-4c2f-9281-8fd57702ce4c", "address": "fa:16:3e:d5:d5:b0", "network": {"id": "0158027f-656f-43e3-aebf-7cbb75cfd948", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "304a5519bb7c46efb34a42749d9cf409", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a8b99a46-3e7f-4ef1-9e45-58e6cd17f210", "external-id": "nsx-vlan-transportzone-704", "segmentation_id": 704, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcb1b9bfc-7a", "ovs_interfaceid": "cb1b9bfc-7a09-4c2f-9281-8fd57702ce4c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 658.766290] env[67893]: DEBUG oslo_concurrency.lockutils [req-57c16e4f-baba-45a7-a3d0-1c6bd26f913f req-22092ea8-d329-4d24-b16e-ae4f658b9a4b service nova] Releasing lock "refresh_cache-2256af1c-4ff8-46b9-b568-c25ce8886e5f" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 659.747438] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Acquiring lock "c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 659.747438] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Lock "c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 661.536733] env[67893]: DEBUG oslo_concurrency.lockutils [None req-43a8cb85-2105-433c-a1df-9ce2d35f4f66 tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Acquiring lock "a2a5e9bc-da8b-42df-9f5f-caf70d72cc0b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 661.537043] env[67893]: DEBUG oslo_concurrency.lockutils [None req-43a8cb85-2105-433c-a1df-9ce2d35f4f66 tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Lock 
"a2a5e9bc-da8b-42df-9f5f-caf70d72cc0b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 662.965629] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ee288a15-7b64-45cb-a1c7-d5d253212f89 tempest-ServersAdminTestJSON-469467968 tempest-ServersAdminTestJSON-469467968-project-member] Acquiring lock "2b03a2c3-33de-4fb4-b723-029652a7c780" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 662.967323] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ee288a15-7b64-45cb-a1c7-d5d253212f89 tempest-ServersAdminTestJSON-469467968 tempest-ServersAdminTestJSON-469467968-project-member] Lock "2b03a2c3-33de-4fb4-b723-029652a7c780" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 663.903915] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d9567953-f6cd-40ac-b9fd-8dd06d10f4ba tempest-ServersWithSpecificFlavorTestJSON-1749998521 tempest-ServersWithSpecificFlavorTestJSON-1749998521-project-member] Acquiring lock "63254807-dead-415e-bdf6-e85780248d8f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 663.904162] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d9567953-f6cd-40ac-b9fd-8dd06d10f4ba tempest-ServersWithSpecificFlavorTestJSON-1749998521 tempest-ServersWithSpecificFlavorTestJSON-1749998521-project-member] Lock "63254807-dead-415e-bdf6-e85780248d8f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 665.080563] env[67893]: DEBUG oslo_concurrency.lockutils [None req-14a3ca71-5c94-427c-a9f7-19e2dc8dc2b2 tempest-ServersAdminTestJSON-469467968 tempest-ServersAdminTestJSON-469467968-project-member] Acquiring lock "194fe3b9-366e-4489-9b1f-2adf2a8ac6ee" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 665.080563] env[67893]: DEBUG oslo_concurrency.lockutils [None req-14a3ca71-5c94-427c-a9f7-19e2dc8dc2b2 tempest-ServersAdminTestJSON-469467968 tempest-ServersAdminTestJSON-469467968-project-member] Lock "194fe3b9-366e-4489-9b1f-2adf2a8ac6ee" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 665.434924] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d8626e5f-8a43-4ebd-9be0-a622c7713b51 tempest-ServersTestBootFromVolume-1376030239 tempest-ServersTestBootFromVolume-1376030239-project-member] Acquiring lock "bac538d8-3dda-4851-8aa3-d60bae70b6ff" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 665.435157] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d8626e5f-8a43-4ebd-9be0-a622c7713b51 tempest-ServersTestBootFromVolume-1376030239 tempest-ServersTestBootFromVolume-1376030239-project-member] Lock "bac538d8-3dda-4851-8aa3-d60bae70b6ff" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 667.199482] env[67893]: DEBUG oslo_concurrency.lockutils [None req-64b6f36b-2c03-4c72-8c06-10aca793c471 tempest-ServerAddressesTestJSON-1960767209 tempest-ServerAddressesTestJSON-1960767209-project-member] Acquiring lock "d3019243-6b64-4d8f-87bb-ace791093969" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 667.199851] env[67893]: DEBUG oslo_concurrency.lockutils [None req-64b6f36b-2c03-4c72-8c06-10aca793c471 tempest-ServerAddressesTestJSON-1960767209 tempest-ServerAddressesTestJSON-1960767209-project-member] Lock "d3019243-6b64-4d8f-87bb-ace791093969" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 669.609045] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d9211127-9ace-48b8-aceb-69ccf40d9ddf tempest-ServerGroupTestJSON-1928902997 tempest-ServerGroupTestJSON-1928902997-project-member] Acquiring lock "e7b5f3f7-5b2d-49c7-be13-3b481f1b3ca8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 669.609364] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d9211127-9ace-48b8-aceb-69ccf40d9ddf tempest-ServerGroupTestJSON-1928902997 tempest-ServerGroupTestJSON-1928902997-project-member] Lock "e7b5f3f7-5b2d-49c7-be13-3b481f1b3ca8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 670.654939] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7d5bc7fa-334c-4f67-803a-7f02f1a95868 tempest-ServerDiagnosticsTest-519189886 tempest-ServerDiagnosticsTest-519189886-project-member] Acquiring lock "9be15a5a-2a28-412b-a893-387b8dd9a2c4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 670.654939] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7d5bc7fa-334c-4f67-803a-7f02f1a95868 tempest-ServerDiagnosticsTest-519189886 tempest-ServerDiagnosticsTest-519189886-project-member] Lock "9be15a5a-2a28-412b-a893-387b8dd9a2c4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 671.642577] env[67893]: DEBUG oslo_concurrency.lockutils [None req-59c8ebbd-5e3a-4e0c-9fac-742b627af296 tempest-AttachInterfacesUnderV243Test-1747987963 
tempest-AttachInterfacesUnderV243Test-1747987963-project-member] Acquiring lock "3d091507-3ab2-45da-a366-ff5d3f107134" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 671.642830] env[67893]: DEBUG oslo_concurrency.lockutils [None req-59c8ebbd-5e3a-4e0c-9fac-742b627af296 tempest-AttachInterfacesUnderV243Test-1747987963 tempest-AttachInterfacesUnderV243Test-1747987963-project-member] Lock "3d091507-3ab2-45da-a366-ff5d3f107134" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 679.583854] env[67893]: WARNING oslo_vmware.rw_handles [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 679.583854] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 679.583854] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 679.583854] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 679.583854] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 679.583854] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 679.583854] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 679.583854] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 679.583854] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 679.583854] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 679.583854] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 679.583854] env[67893]: ERROR oslo_vmware.rw_handles [ 679.584752] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/5c9cbb4b-6bc7-425b-96fc-9cdd49e101e7/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 679.585601] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 679.586057] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Copying Virtual Disk [datastore1] vmware_temp/5c9cbb4b-6bc7-425b-96fc-9cdd49e101e7/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] 
vmware_temp/5c9cbb4b-6bc7-425b-96fc-9cdd49e101e7/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 679.586358] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6ecf1bce-33ab-4e5d-b34e-2cee402ddad1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 679.596522] env[67893]: DEBUG oslo_vmware.api [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Waiting for the task: (returnval){ [ 679.596522] env[67893]: value = "task-3455328" [ 679.596522] env[67893]: _type = "Task" [ 679.596522] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 679.605197] env[67893]: DEBUG oslo_vmware.api [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Task: {'id': task-3455328, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 680.120179] env[67893]: DEBUG oslo_vmware.exceptions [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 680.120179] env[67893]: DEBUG oslo_concurrency.lockutils [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 680.124988] env[67893]: ERROR nova.compute.manager [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 680.124988] env[67893]: Faults: ['InvalidArgument'] [ 680.124988] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Traceback (most recent call last): [ 680.124988] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 680.124988] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] yield resources [ 680.124988] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 680.124988] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] self.driver.spawn(context, instance, image_meta, [ 680.124988] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 680.124988] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] 
self._vmops.spawn(context, instance, image_meta, injected_files, [ 680.124988] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 680.124988] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] self._fetch_image_if_missing(context, vi) [ 680.124988] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 680.125428] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] image_cache(vi, tmp_image_ds_loc) [ 680.125428] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 680.125428] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] vm_util.copy_virtual_disk( [ 680.125428] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 680.125428] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] session._wait_for_task(vmdk_copy_task) [ 680.125428] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 680.125428] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] return self.wait_for_task(task_ref) [ 680.125428] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 680.125428] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] return evt.wait() [ 680.125428] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 680.125428] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] result = hub.switch() [ 680.125428] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 680.125428] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] return self.greenlet.switch() [ 680.125793] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 680.125793] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] self.f(*self.args, **self.kw) [ 680.125793] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 680.125793] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] raise exceptions.translate_fault(task_info.error) [ 680.125793] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 
680.125793] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Faults: ['InvalidArgument'] [ 680.125793] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] [ 680.125793] env[67893]: INFO nova.compute.manager [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Terminating instance [ 680.127474] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 680.127474] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 680.127879] env[67893]: DEBUG nova.compute.manager [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 680.128098] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 680.128253] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-04ce14e6-3250-4974-9d9e-fa3e9a41b31d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 680.134245] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00089cf3-03b4-4d66-87d5-6a2e984d86ef {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 680.145608] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 680.147574] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-099d7756-bece-4db3-93c1-48abb766ba34 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 680.150889] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 680.152656] 
env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 680.153324] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0982ecf4-41c4-4763-868e-203ddc04b18d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 680.168206] env[67893]: DEBUG oslo_vmware.api [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Waiting for the task: (returnval){ [ 680.168206] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52575cd7-2ef6-b936-7c3d-708d1b1ea402" [ 680.168206] env[67893]: _type = "Task" [ 680.168206] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 680.175207] env[67893]: DEBUG oslo_vmware.api [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52575cd7-2ef6-b936-7c3d-708d1b1ea402, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 680.231447] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 680.231687] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 680.231870] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Deleting the datastore file [datastore1] 0d074bfa-7d3d-4e69-b544-36e7d9f79483 {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 680.232161] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9c5af431-f6c0-4a64-bf9a-b5217290db0b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 680.240587] env[67893]: DEBUG oslo_vmware.api [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Waiting for the task: (returnval){ [ 680.240587] env[67893]: value = "task-3455330" [ 680.240587] env[67893]: _type = "Task" [ 680.240587] env[67893]: } to complete. 
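The FileManager.DeleteDatastoreFile_Task call above, restated in oslo.vmware terms; `session` is assumed from the earlier sketch and `dc_ref` stands in for a datacenter managed-object reference:

```python
dc_ref = None  # placeholder: the ha-datacenter moref in this log
file_manager = session.vim.service_content.fileManager

task = session.invoke_api(
    session.vim, 'DeleteDatastoreFile_Task', file_manager,
    name='[datastore1] 0d074bfa-7d3d-4e69-b544-36e7d9f79483',
    datacenter=dc_ref)
session.wait_for_task(task)  # polls task-3455330-style tasks to completion
```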
{{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 680.251424] env[67893]: DEBUG oslo_vmware.api [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Task: {'id': task-3455330, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 680.409766] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f609ccc1-50ff-441e-8db1-7ff784a0ec32 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Acquiring lock "938d37dd-509b-4923-b192-3ce4a6d530c5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 680.409989] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f609ccc1-50ff-441e-8db1-7ff784a0ec32 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Lock "938d37dd-509b-4923-b192-3ce4a6d530c5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 680.682603] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 680.683363] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Creating directory with path [datastore1] vmware_temp/1c0c9f18-6b29-4565-a8d4-c8c6dfc6095b/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 680.683789] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c360ff09-65bf-4ce4-8733-897d62e0e889 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 680.696137] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Created directory with path [datastore1] vmware_temp/1c0c9f18-6b29-4565-a8d4-c8c6dfc6095b/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 680.698019] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Fetch image to [datastore1] vmware_temp/1c0c9f18-6b29-4565-a8d4-c8c6dfc6095b/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 680.698019] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 
tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/1c0c9f18-6b29-4565-a8d4-c8c6dfc6095b/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 680.698019] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cb78191-7797-403d-b7c3-553f65cb2d5f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 680.707131] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fe3d5f2-113d-4ce4-9fa0-fc32af2e956c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 680.718424] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdb90f0b-0ba5-45dd-91a5-6222a1ed64fc {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 680.760977] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2ceef59-cc6d-4262-8447-674352f5ddf7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 680.769124] env[67893]: DEBUG oslo_vmware.api [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Task: {'id': task-3455330, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071695} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 680.770872] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 680.774376] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 680.775632] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 680.775632] env[67893]: INFO nova.compute.manager [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Took 0.65 seconds to destroy the instance on the hypervisor. 
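The destroy sequence just above is the cleanup path for the earlier CopyVirtualDisk_Task fault. In oslo.vmware, a failed task surfaces as VimFaultException with the VMOMI fault names in `fault_list`; a sketch of how a caller distinguishes the InvalidArgument/fileType case (`session` and `copy_task` assumed from the earlier sketches):

```python
from oslo_vmware import exceptions as vexc

try:
    session.wait_for_task(copy_task)  # e.g. a CopyVirtualDisk_Task
except vexc.VimFaultException as exc:
    if 'InvalidArgument' in exc.fault_list:
        # Mirrors the flow above: the spawn is aborted, the half-built VM
        # is unregistered and its datastore files deleted, then the
        # failure propagates to _build_and_run_instance.
        raise
```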
[ 680.777643] env[67893]: DEBUG nova.compute.claims [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 680.777856] env[67893]: DEBUG oslo_concurrency.lockutils [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 680.780470] env[67893]: DEBUG oslo_concurrency.lockutils [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.002s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 680.783773] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b6a96094-4f70-49eb-88ee-62e9a7878a86 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 680.812441] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 680.898473] env[67893]: DEBUG oslo_vmware.rw_handles [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1c0c9f18-6b29-4565-a8d4-c8c6dfc6095b/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 680.961849] env[67893]: DEBUG oslo_vmware.rw_handles [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 680.961849] env[67893]: DEBUG oslo_vmware.rw_handles [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1c0c9f18-6b29-4565-a8d4-c8c6dfc6095b/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 681.356144] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82a6c420-ed12-4d97-98b1-d2506691cbc4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 681.366182] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6239188a-9193-4223-bd5f-10a1160708ad {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 681.398583] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bfc0390-d520-4116-bfc5-476b031fa4c4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 681.406637] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29586cfd-4b3b-41ea-91fb-d36721639a74 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 681.421264] env[67893]: DEBUG nova.compute.provider_tree [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 681.431536] env[67893]: DEBUG nova.scheduler.client.report [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 681.453458] env[67893]: DEBUG oslo_concurrency.lockutils [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.673s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 681.453986] env[67893]: ERROR nova.compute.manager [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 681.453986] env[67893]: Faults: ['InvalidArgument'] [ 681.453986] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Traceback (most recent call last): [ 681.453986] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 681.453986] env[67893]: ERROR nova.compute.manager [instance: 
0d074bfa-7d3d-4e69-b544-36e7d9f79483] self.driver.spawn(context, instance, image_meta, [ 681.453986] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 681.453986] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] self._vmops.spawn(context, instance, image_meta, injected_files, [ 681.453986] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 681.453986] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] self._fetch_image_if_missing(context, vi) [ 681.453986] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 681.453986] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] image_cache(vi, tmp_image_ds_loc) [ 681.453986] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 681.456081] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] vm_util.copy_virtual_disk( [ 681.456081] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 681.456081] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] session._wait_for_task(vmdk_copy_task) [ 681.456081] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 681.456081] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] return self.wait_for_task(task_ref) [ 681.456081] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 681.456081] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] return evt.wait() [ 681.456081] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 681.456081] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] result = hub.switch() [ 681.456081] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 681.456081] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] return self.greenlet.switch() [ 681.456081] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 681.456081] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] self.f(*self.args, **self.kw) [ 681.456630] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 
448, in _poll_task [ 681.456630] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] raise exceptions.translate_fault(task_info.error) [ 681.456630] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 681.456630] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Faults: ['InvalidArgument'] [ 681.456630] env[67893]: ERROR nova.compute.manager [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] [ 681.456630] env[67893]: DEBUG nova.compute.utils [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 681.459168] env[67893]: DEBUG nova.compute.manager [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Build of instance 0d074bfa-7d3d-4e69-b544-36e7d9f79483 was re-scheduled: A specified parameter was not correct: fileType [ 681.459168] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 681.459770] env[67893]: DEBUG nova.compute.manager [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 681.460374] env[67893]: DEBUG nova.compute.manager [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 681.462128] env[67893]: DEBUG nova.compute.manager [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 681.462128] env[67893]: DEBUG nova.network.neutron [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 682.012540] env[67893]: DEBUG nova.network.neutron [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 682.028137] env[67893]: INFO nova.compute.manager [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 0d074bfa-7d3d-4e69-b544-36e7d9f79483] Took 0.57 seconds to deallocate network for instance. [ 682.177524] env[67893]: INFO nova.scheduler.client.report [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Deleted allocations for instance 0d074bfa-7d3d-4e69-b544-36e7d9f79483 [ 682.218997] env[67893]: DEBUG oslo_concurrency.lockutils [None req-13c71cb6-d3dd-4f50-a5b8-96a7548ea633 tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Lock "0d074bfa-7d3d-4e69-b544-36e7d9f79483" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 64.860s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 682.273980] env[67893]: DEBUG nova.compute.manager [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Starting instance...
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 682.369029] env[67893]: DEBUG oslo_concurrency.lockutils [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 682.369029] env[67893]: DEBUG oslo_concurrency.lockutils [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 682.370609] env[67893]: INFO nova.compute.claims [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 682.441901] env[67893]: DEBUG oslo_concurrency.lockutils [None req-33158d07-2c7f-4ce0-8a5f-ecfb5afddfc5 tempest-ServerAddressesNegativeTestJSON-2054882907 tempest-ServerAddressesNegativeTestJSON-2054882907-project-member] Acquiring lock "40c8659f-361a-4bf7-b16c-00bfc2c98729" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 682.442343] env[67893]: DEBUG oslo_concurrency.lockutils [None req-33158d07-2c7f-4ce0-8a5f-ecfb5afddfc5 tempest-ServerAddressesNegativeTestJSON-2054882907 tempest-ServerAddressesNegativeTestJSON-2054882907-project-member] Lock "40c8659f-361a-4bf7-b16c-00bfc2c98729" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 682.865594] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-458ed45c-fac6-49a3-b918-64e0294cb7cb {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 682.875018] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b6594a5-d2c0-4051-a50f-1fb36d95febf {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 682.908327] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1c88d2f-b381-495c-9146-360a6323ff5d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 682.917215] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e90db93d-99eb-4101-b4c7-43ec1e75e621 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 682.931399] env[67893]: DEBUG nova.compute.provider_tree [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548
tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 682.942614] env[67893]: DEBUG nova.scheduler.client.report [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 682.962686] env[67893]: DEBUG oslo_concurrency.lockutils [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.594s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 682.963905] env[67893]: DEBUG nova.compute.manager [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 683.011434] env[67893]: DEBUG nova.compute.utils [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 683.013034] env[67893]: DEBUG nova.compute.manager [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 683.013301] env[67893]: DEBUG nova.network.neutron [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 683.026962] env[67893]: DEBUG nova.compute.manager [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Start building block device mappings for instance. 
{{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 683.105778] env[67893]: DEBUG nova.policy [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '65035ab919bc4461a905c4114fbb06fc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'eeccec2be84d420ca13b6050a34782e3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 683.116987] env[67893]: DEBUG nova.compute.manager [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Start spawning the instance on the hypervisor. {{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 683.155962] env[67893]: DEBUG nova.virt.hardware [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 683.155962] env[67893]: DEBUG nova.virt.hardware [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 683.155962] env[67893]: DEBUG nova.virt.hardware [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 683.156130] env[67893]: DEBUG nova.virt.hardware [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 683.156232] env[67893]: DEBUG nova.virt.hardware [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Image pref 0:0:0 {{(pid=67893) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 683.159111] env[67893]: DEBUG nova.virt.hardware [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 683.159111] env[67893]: DEBUG nova.virt.hardware [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 683.159111] env[67893]: DEBUG nova.virt.hardware [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 683.159111] env[67893]: DEBUG nova.virt.hardware [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 683.159434] env[67893]: DEBUG nova.virt.hardware [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 683.159434] env[67893]: DEBUG nova.virt.hardware [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 683.159434] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a728d8ad-c64e-4c71-84b2-f9e0bfe310d5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 683.167545] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63d848cd-0ab0-4285-b10d-26627f84153f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 683.938616] env[67893]: DEBUG nova.network.neutron [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Successfully created port: dbcc6e46-3ff7-47e3-af68-7a92c82cde9f {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 684.037627] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Acquiring lock "a9656a7e-8a7b-489e-9990-097c1e93e535" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 684.037907] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Lock "a9656a7e-8a7b-489e-9990-097c1e93e535" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 684.657186] env[67893]: DEBUG oslo_concurrency.lockutils [None req-9aeba813-a345-45e6-936d-49a7ba9d1b0a tempest-ServersAdminNegativeTestJSON-1263096693 tempest-ServersAdminNegativeTestJSON-1263096693-project-member] Acquiring lock "cb485828-0620-48fd-a9d4-a83e690f4675" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 684.658295] env[67893]: DEBUG oslo_concurrency.lockutils [None req-9aeba813-a345-45e6-936d-49a7ba9d1b0a tempest-ServersAdminNegativeTestJSON-1263096693 tempest-ServersAdminNegativeTestJSON-1263096693-project-member] Lock "cb485828-0620-48fd-a9d4-a83e690f4675" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 685.061575] env[67893]: DEBUG nova.network.neutron [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Successfully updated port: dbcc6e46-3ff7-47e3-af68-7a92c82cde9f {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 685.078874] env[67893]: DEBUG oslo_concurrency.lockutils [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Acquiring lock "refresh_cache-6520080a-8bf1-4803-9099-87c3ba6e28e4" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 685.079077] env[67893]: DEBUG oslo_concurrency.lockutils [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Acquired lock "refresh_cache-6520080a-8bf1-4803-9099-87c3ba6e28e4" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 685.079647] env[67893]: DEBUG nova.network.neutron [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 685.179931] env[67893]: DEBUG nova.network.neutron [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Instance cache missing network info. 
{{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 685.801635] env[67893]: DEBUG nova.network.neutron [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Updating instance_info_cache with network_info: [{"id": "dbcc6e46-3ff7-47e3-af68-7a92c82cde9f", "address": "fa:16:3e:f2:fb:96", "network": {"id": "2f691698-27e1-4595-b423-4cfb42f90a01", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-374075247-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eeccec2be84d420ca13b6050a34782e3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "47499d09-8010-4d02-ac96-4f057c104692", "external-id": "nsx-vlan-transportzone-14", "segmentation_id": 14, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdbcc6e46-3f", "ovs_interfaceid": "dbcc6e46-3ff7-47e3-af68-7a92c82cde9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 685.825719] env[67893]: DEBUG oslo_concurrency.lockutils [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Releasing lock "refresh_cache-6520080a-8bf1-4803-9099-87c3ba6e28e4" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 685.825719] env[67893]: DEBUG nova.compute.manager [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Instance network_info: |[{"id": "dbcc6e46-3ff7-47e3-af68-7a92c82cde9f", "address": "fa:16:3e:f2:fb:96", "network": {"id": "2f691698-27e1-4595-b423-4cfb42f90a01", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-374075247-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eeccec2be84d420ca13b6050a34782e3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "47499d09-8010-4d02-ac96-4f057c104692", "external-id": "nsx-vlan-transportzone-14", "segmentation_id": 14, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdbcc6e46-3f", "ovs_interfaceid": "dbcc6e46-3ff7-47e3-af68-7a92c82cde9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": 
{}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 685.825975] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f2:fb:96', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '47499d09-8010-4d02-ac96-4f057c104692', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'dbcc6e46-3ff7-47e3-af68-7a92c82cde9f', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 685.833977] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Creating folder: Project (eeccec2be84d420ca13b6050a34782e3). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 685.834917] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-746c36ae-f11a-4c0c-bf27-ba6554cbce97 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 685.846598] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Created folder: Project (eeccec2be84d420ca13b6050a34782e3) in parent group-v689771. [ 685.847163] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Creating folder: Instances. Parent ref: group-v689806. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 685.847560] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-624256fb-674a-451b-a48a-7b83051a6e86 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 685.857055] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Created folder: Instances in parent group-v689806. [ 685.857451] env[67893]: DEBUG oslo.service.loopingcall [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 685.857737] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 685.860014] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c1e26195-5e13-4c7e-bd7a-77ff5f880a74 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 685.880732] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 685.880732] env[67893]: value = "task-3455333" [ 685.880732] env[67893]: _type = "Task" [ 685.880732] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 685.890628] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455333, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 686.393607] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455333, 'name': CreateVM_Task, 'duration_secs': 0.386432} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 686.393888] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 686.394509] env[67893]: DEBUG oslo_concurrency.lockutils [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 686.394666] env[67893]: DEBUG oslo_concurrency.lockutils [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 686.395499] env[67893]: DEBUG oslo_concurrency.lockutils [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 686.395499] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-85636521-61ca-40db-b2b6-184fe78f2bc1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 686.400051] env[67893]: DEBUG oslo_vmware.api [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Waiting for the task: (returnval){ [ 686.400051] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]523154e4-338e-e4db-f3c3-1f5839b2c264" [ 686.400051] env[67893]: _type = "Task" [ 686.400051] 
env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 686.410758] env[67893]: DEBUG oslo_vmware.api [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]523154e4-338e-e4db-f3c3-1f5839b2c264, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 686.566611] env[67893]: DEBUG nova.compute.manager [req-96356bf6-c824-4b54-ac14-de94a5f62f00 req-7b728db8-0f9f-41a3-b228-12fac34cf711 service nova] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Received event network-vif-plugged-dbcc6e46-3ff7-47e3-af68-7a92c82cde9f {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 686.566869] env[67893]: DEBUG oslo_concurrency.lockutils [req-96356bf6-c824-4b54-ac14-de94a5f62f00 req-7b728db8-0f9f-41a3-b228-12fac34cf711 service nova] Acquiring lock "6520080a-8bf1-4803-9099-87c3ba6e28e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 686.567088] env[67893]: DEBUG oslo_concurrency.lockutils [req-96356bf6-c824-4b54-ac14-de94a5f62f00 req-7b728db8-0f9f-41a3-b228-12fac34cf711 service nova] Lock "6520080a-8bf1-4803-9099-87c3ba6e28e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 686.567255] env[67893]: DEBUG oslo_concurrency.lockutils [req-96356bf6-c824-4b54-ac14-de94a5f62f00 req-7b728db8-0f9f-41a3-b228-12fac34cf711 service nova] Lock "6520080a-8bf1-4803-9099-87c3ba6e28e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 686.568177] env[67893]: DEBUG nova.compute.manager [req-96356bf6-c824-4b54-ac14-de94a5f62f00 req-7b728db8-0f9f-41a3-b228-12fac34cf711 service nova] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] No waiting events found dispatching network-vif-plugged-dbcc6e46-3ff7-47e3-af68-7a92c82cde9f {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 686.568311] env[67893]: WARNING nova.compute.manager [req-96356bf6-c824-4b54-ac14-de94a5f62f00 req-7b728db8-0f9f-41a3-b228-12fac34cf711 service nova] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Received unexpected event network-vif-plugged-dbcc6e46-3ff7-47e3-af68-7a92c82cde9f for instance with vm_state building and task_state spawning. [ 686.568501] env[67893]: DEBUG nova.compute.manager [req-96356bf6-c824-4b54-ac14-de94a5f62f00 req-7b728db8-0f9f-41a3-b228-12fac34cf711 service nova] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Received event network-changed-dbcc6e46-3ff7-47e3-af68-7a92c82cde9f {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 686.568662] env[67893]: DEBUG nova.compute.manager [req-96356bf6-c824-4b54-ac14-de94a5f62f00 req-7b728db8-0f9f-41a3-b228-12fac34cf711 service nova] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Refreshing instance network info cache due to event network-changed-dbcc6e46-3ff7-47e3-af68-7a92c82cde9f.
{{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 686.568854] env[67893]: DEBUG oslo_concurrency.lockutils [req-96356bf6-c824-4b54-ac14-de94a5f62f00 req-7b728db8-0f9f-41a3-b228-12fac34cf711 service nova] Acquiring lock "refresh_cache-6520080a-8bf1-4803-9099-87c3ba6e28e4" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 686.568992] env[67893]: DEBUG oslo_concurrency.lockutils [req-96356bf6-c824-4b54-ac14-de94a5f62f00 req-7b728db8-0f9f-41a3-b228-12fac34cf711 service nova] Acquired lock "refresh_cache-6520080a-8bf1-4803-9099-87c3ba6e28e4" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 686.569379] env[67893]: DEBUG nova.network.neutron [req-96356bf6-c824-4b54-ac14-de94a5f62f00 req-7b728db8-0f9f-41a3-b228-12fac34cf711 service nova] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Refreshing network info cache for port dbcc6e46-3ff7-47e3-af68-7a92c82cde9f {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 686.914821] env[67893]: DEBUG oslo_concurrency.lockutils [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 686.916633] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 686.917782] env[67893]: DEBUG oslo_concurrency.lockutils [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 687.234630] env[67893]: DEBUG nova.network.neutron [req-96356bf6-c824-4b54-ac14-de94a5f62f00 req-7b728db8-0f9f-41a3-b228-12fac34cf711 service nova] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Updated VIF entry in instance network info cache for port dbcc6e46-3ff7-47e3-af68-7a92c82cde9f. 
{{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 687.234978] env[67893]: DEBUG nova.network.neutron [req-96356bf6-c824-4b54-ac14-de94a5f62f00 req-7b728db8-0f9f-41a3-b228-12fac34cf711 service nova] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Updating instance_info_cache with network_info: [{"id": "dbcc6e46-3ff7-47e3-af68-7a92c82cde9f", "address": "fa:16:3e:f2:fb:96", "network": {"id": "2f691698-27e1-4595-b423-4cfb42f90a01", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-374075247-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eeccec2be84d420ca13b6050a34782e3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "47499d09-8010-4d02-ac96-4f057c104692", "external-id": "nsx-vlan-transportzone-14", "segmentation_id": 14, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdbcc6e46-3f", "ovs_interfaceid": "dbcc6e46-3ff7-47e3-af68-7a92c82cde9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 687.250675] env[67893]: DEBUG oslo_concurrency.lockutils [req-96356bf6-c824-4b54-ac14-de94a5f62f00 req-7b728db8-0f9f-41a3-b228-12fac34cf711 service nova] Releasing lock "refresh_cache-6520080a-8bf1-4803-9099-87c3ba6e28e4" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 688.785889] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7f6166e4-36b0-4dfa-827b-6fe59bbe057c tempest-ServerPasswordTestJSON-451294384 tempest-ServerPasswordTestJSON-451294384-project-member] Acquiring lock "dd3d49f4-83c5-4a83-9674-fed5e190743c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 688.786179] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7f6166e4-36b0-4dfa-827b-6fe59bbe057c tempest-ServerPasswordTestJSON-451294384 tempest-ServerPasswordTestJSON-451294384-project-member] Lock "dd3d49f4-83c5-4a83-9674-fed5e190743c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 689.505928] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5584c123-651e-44db-84e7-9991ba49463b tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Acquiring lock "9e3bbca4-2031-4a02-819c-2c9cf720eba9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 689.506956] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5584c123-651e-44db-84e7-9991ba49463b tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Lock
"9e3bbca4-2031-4a02-819c-2c9cf720eba9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 697.316219] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 697.341487] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 697.858965] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 697.861214] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 697.861214] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 697.861214] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 697.883765] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 697.883938] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 697.884825] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 697.885079] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 697.885232] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Skipping network cache update for instance because it is Building. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 697.885363] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 697.885542] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 697.885622] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 697.885716] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 697.885834] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 697.885955] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 697.888011] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 697.888313] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 697.888500] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 697.888736] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 697.888961] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 697.898707] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 697.898923] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 697.899102] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 697.899265] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 697.900378] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f420b69-f435-4fa1-b72e-c45f15a02d9c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 697.910585] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e74f117-5868-40ea-b5ca-2a19f6d2fbb3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 697.928205] env[67893]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cda97fc1-f66b-44ac-a9cc-ba78ed08af97 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 697.934871] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08e4dc93-3e8a-476c-b62d-5137c750ebf8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 697.963929] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180928MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 697.964101] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 697.964319] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 698.061208] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance d9e47a83-7921-4cf6-ba99-fb705bc52e4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 698.061208] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 30d52736-4195-4767-89e0-8572dc96de29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 698.061208] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 698.061208] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 698.061556] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 043c631c-bf15-4b4c-9a92-49ea51b6d405 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 698.061556] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 698.061556] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2f69fae8-d060-4156-8880-071f5ee1f969 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 698.061556] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 698.061671] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2256af1c-4ff8-46b9-b568-c25ce8886e5f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 698.061671] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 6520080a-8bf1-4803-9099-87c3ba6e28e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 698.084562] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 19ab9782-9131-46ba-bbf2-cc021953046e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.109899] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2eb8d698-9436-4e91-bd10-5f5200415144 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.121313] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.133580] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.143367] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a2a5e9bc-da8b-42df-9f5f-caf70d72cc0b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.160238] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2b03a2c3-33de-4fb4-b723-029652a7c780 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.173107] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 63254807-dead-415e-bdf6-e85780248d8f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.186985] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 194fe3b9-366e-4489-9b1f-2adf2a8ac6ee has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.205681] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance bac538d8-3dda-4851-8aa3-d60bae70b6ff has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.219499] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance d3019243-6b64-4d8f-87bb-ace791093969 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.236798] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance e7b5f3f7-5b2d-49c7-be13-3b481f1b3ca8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.252992] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 9be15a5a-2a28-412b-a893-387b8dd9a2c4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.265759] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 3d091507-3ab2-45da-a366-ff5d3f107134 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.279317] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 938d37dd-509b-4923-b192-3ce4a6d530c5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.309435] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 40c8659f-361a-4bf7-b16c-00bfc2c98729 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.322703] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a9656a7e-8a7b-489e-9990-097c1e93e535 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.335716] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance cb485828-0620-48fd-a9d4-a83e690f4675 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.350594] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance dd3d49f4-83c5-4a83-9674-fed5e190743c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.361405] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 9e3bbca4-2031-4a02-819c-2c9cf720eba9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.361787] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 698.361787] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 698.866347] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df5e1d28-3eef-4663-a96c-fe8b645d1980 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.874412] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eef77de6-1b18-4f1d-b6fd-b3e9a13df0ae {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.906369] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53f14d8b-0394-4b67-b7c1-74c0d04c2e87 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.915023] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0625d7ae-0290-4605-a77d-a81b256059e1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.927217] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 698.937462] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 
'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 698.957358] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 698.957564] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.993s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 699.927789] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 699.928117] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 702.707352] env[67893]: WARNING oslo_vmware.rw_handles [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 702.707352] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 702.707352] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 702.707352] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 702.707352] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 702.707352] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 702.707352] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 702.707352] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 702.707352] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 702.707352] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 702.707352] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 702.707352] env[67893]: ERROR oslo_vmware.rw_handles [ 702.710714] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/300ba298-96e9-46a3-a9ac-bc8cbe54b1b3/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore2 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 702.710714] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None 
req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 702.710714] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Copying Virtual Disk [datastore2] vmware_temp/300ba298-96e9-46a3-a9ac-bc8cbe54b1b3/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore2] vmware_temp/300ba298-96e9-46a3-a9ac-bc8cbe54b1b3/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 702.710714] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5e7eeff4-1c7f-4885-87f4-0bd144928db1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 702.725777] env[67893]: DEBUG oslo_vmware.api [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Waiting for the task: (returnval){ [ 702.725777] env[67893]: value = "task-3455334" [ 702.725777] env[67893]: _type = "Task" [ 702.725777] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 702.739926] env[67893]: DEBUG oslo_vmware.api [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Task: {'id': task-3455334, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 703.237027] env[67893]: DEBUG oslo_vmware.exceptions [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Fault InvalidArgument not matched. 
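[editor's note] The CopyVirtualDisk_Task records above, together with the "Fault InvalidArgument not matched" line, show oslo.vmware's task-polling and fault-translation path: wait_for_task() polls the vCenter Task object (logging "progress is N%") and, on error, maps the fault name to a specific exception class, falling back to the generic VimFaultException when nothing matches. A hedged sketch; the host, credentials, disk-manager reference, and datastore paths are placeholders, and constructing the session performs a real login:

from oslo_vmware import api
from oslo_vmware import exceptions as vexc

# Placeholder endpoint/credentials; VMwareAPISession logs in on creation.
session = api.VMwareAPISession('vc.example.test', 'user', 'secret',
                               api_retry_count=3, task_poll_interval=0.5)

try:
    # invoke_api() issues the SOAP call against the managed object.
    task = session.invoke_api(
        session.vim, 'CopyVirtualDisk_Task',
        session.vim.service_content.virtualDiskManager,
        sourceName='[datastore2] tmp-sparse.vmdk',
        destName='[datastore2] image.vmdk')
    session.wait_for_task(task)  # drives the "progress is N%" poll loop
except vexc.VimFaultException as e:
    # For the failure in this log this carries Faults: ['InvalidArgument'].
    print(e.fault_list)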
{{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 703.237509] env[67893]: DEBUG oslo_concurrency.lockutils [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 703.238181] env[67893]: ERROR nova.compute.manager [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 703.238181] env[67893]: Faults: ['InvalidArgument'] [ 703.238181] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Traceback (most recent call last): [ 703.238181] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 703.238181] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] yield resources [ 703.238181] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 703.238181] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] self.driver.spawn(context, instance, image_meta, [ 703.238181] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 703.238181] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 703.238181] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 703.238181] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] self._fetch_image_if_missing(context, vi) [ 703.238181] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 703.238181] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] image_cache(vi, tmp_image_ds_loc) [ 703.238608] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 703.238608] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] vm_util.copy_virtual_disk( [ 703.238608] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 703.238608] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] session._wait_for_task(vmdk_copy_task) [ 703.238608] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 703.238608] 
env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] return self.wait_for_task(task_ref) [ 703.238608] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 703.238608] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] return evt.wait() [ 703.238608] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 703.238608] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] result = hub.switch() [ 703.238608] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 703.238608] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] return self.greenlet.switch() [ 703.238608] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 703.238973] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] self.f(*self.args, **self.kw) [ 703.238973] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 703.238973] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] raise exceptions.translate_fault(task_info.error) [ 703.238973] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 703.238973] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Faults: ['InvalidArgument'] [ 703.238973] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] [ 703.239442] env[67893]: INFO nova.compute.manager [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Terminating instance [ 703.242289] env[67893]: DEBUG nova.compute.manager [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Start destroying the instance on the hypervisor. 
{{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 703.243623] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 703.244632] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f210bbff-34bb-4ee0-b46b-78dcf2e7d370 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.253028] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 703.253028] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-42ae4f58-7de4-48cb-9a10-5292be7830a6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.320323] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 703.320674] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Deleting contents of the VM from datastore datastore2 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 703.320922] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Deleting the datastore file [datastore2] b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1 {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 703.321444] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1f8b3dad-d881-4dcd-9019-73d3380fb690 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.328747] env[67893]: DEBUG oslo_vmware.api [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Waiting for the task: (returnval){ [ 703.328747] env[67893]: value = "task-3455336" [ 703.328747] env[67893]: _type = "Task" [ 703.328747] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 703.338391] env[67893]: DEBUG oslo_vmware.api [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Task: {'id': task-3455336, 'name': DeleteDatastoreFile_Task} progress is 0%. 
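[editor's note] The unregister-and-delete sequence above (VirtualMachine.UnregisterVM, then FileManager.DeleteDatastoreFile_Task polled to completion) follows the same invoke_api/wait_for_task pattern. A fragmentary sketch reusing the session object from the previous example; vm_ref, dc_ref, and the datastore path are placeholders:

# Unregister the VM, then delete its directory from the datastore.
session.invoke_api(session.vim, 'UnregisterVM', vm_ref)
file_manager = session.vim.service_content.fileManager
task = session.invoke_api(
    session.vim, 'DeleteDatastoreFile_Task', file_manager,
    name='[datastore2] b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1',
    datacenter=dc_ref)
session.wait_for_task(task)  # logs the DeleteDatastoreFile_Task result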
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 703.839679] env[67893]: DEBUG oslo_vmware.api [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Task: {'id': task-3455336, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073846} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 703.840711] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 703.842095] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Deleted contents of the VM from datastore datastore2 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 703.842095] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 703.842095] env[67893]: INFO nova.compute.manager [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Took 0.60 seconds to destroy the instance on the hypervisor. [ 703.848576] env[67893]: DEBUG nova.compute.claims [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 703.848576] env[67893]: DEBUG oslo_concurrency.lockutils [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 703.848576] env[67893]: DEBUG oslo_concurrency.lockutils [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 704.019601] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb89611c-e2f2-4ff4-a31a-613d7b7b9565 tempest-ServersTestManualDisk-14099750 tempest-ServersTestManualDisk-14099750-project-member] Acquiring lock "88053de1-3cc2-4776-a56e-b34aa0c93764" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 704.019601] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb89611c-e2f2-4ff4-a31a-613d7b7b9565 tempest-ServersTestManualDisk-14099750
tempest-ServersTestManualDisk-14099750-project-member] Lock "88053de1-3cc2-4776-a56e-b34aa0c93764" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 704.335565] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a06a750b-041e-41b2-b012-9ac72beb50f4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 704.343846] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4b61bbd-e22b-42b8-a53c-6af86a87cac9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 704.375366] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3804355a-d2dc-471b-8b6f-32170fcebb2f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 704.382912] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bc981d5-e499-41f7-9867-65839a1a3f9f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 704.398150] env[67893]: DEBUG nova.compute.provider_tree [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 704.411478] env[67893]: DEBUG nova.scheduler.client.report [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 704.425647] env[67893]: DEBUG oslo_concurrency.lockutils [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.577s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 704.426501] env[67893]: ERROR nova.compute.manager [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 704.426501] env[67893]: Faults: ['InvalidArgument'] [ 704.426501] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Traceback (most recent call last): [ 704.426501] env[67893]: ERROR nova.compute.manager [instance:
b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 704.426501] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] self.driver.spawn(context, instance, image_meta, [ 704.426501] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 704.426501] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 704.426501] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 704.426501] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] self._fetch_image_if_missing(context, vi) [ 704.426501] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 704.426501] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] image_cache(vi, tmp_image_ds_loc) [ 704.426501] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 704.426898] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] vm_util.copy_virtual_disk( [ 704.426898] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 704.426898] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] session._wait_for_task(vmdk_copy_task) [ 704.426898] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 704.426898] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] return self.wait_for_task(task_ref) [ 704.426898] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 704.426898] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] return evt.wait() [ 704.426898] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 704.426898] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] result = hub.switch() [ 704.426898] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 704.426898] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] return self.greenlet.switch() [ 704.426898] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 704.426898] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] self.f(*self.args, 
**self.kw) [ 704.429919] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 704.429919] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] raise exceptions.translate_fault(task_info.error) [ 704.429919] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 704.429919] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Faults: ['InvalidArgument'] [ 704.429919] env[67893]: ERROR nova.compute.manager [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] [ 704.429919] env[67893]: DEBUG nova.compute.utils [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 704.429919] env[67893]: DEBUG nova.compute.manager [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Build of instance b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1 was re-scheduled: A specified parameter was not correct: fileType [ 704.429919] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 704.429919] env[67893]: DEBUG nova.compute.manager [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 704.430239] env[67893]: DEBUG nova.compute.manager [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
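[editor's note] The "Virt driver does not provide unplug_vifs method" record above comes from the manager probing an optional driver capability: the base driver raises NotImplementedError and the caller treats that as "cannot tell whether VIFs need unplugging". A simplified illustration of the pattern; the class and logging wiring here are not Nova's actual code:

import logging

LOG = logging.getLogger(__name__)


class BaseDriver:
    def unplug_vifs(self, instance, network_info):
        # Optional capability: concrete drivers may override this.
        raise NotImplementedError()


def cleanup_allocated_networks(driver, instance, network_info):
    try:
        driver.unplug_vifs(instance, network_info)
    except NotImplementedError:
        # Mirrors the DEBUG record emitted above.
        LOG.debug('Virt driver does not provide unplug_vifs method, so it '
                  'is not possible determine if VIFs should be unplugged.')


cleanup_allocated_networks(BaseDriver(), instance=None, network_info=[])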
{{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 704.430239] env[67893]: DEBUG nova.compute.manager [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 704.430239] env[67893]: DEBUG nova.network.neutron [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 705.293564] env[67893]: DEBUG nova.network.neutron [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 705.307545] env[67893]: INFO nova.compute.manager [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] [instance: b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1] Took 0.88 seconds to deallocate network for instance. [ 705.435830] env[67893]: INFO nova.scheduler.client.report [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Deleted allocations for instance b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1 [ 705.458933] env[67893]: DEBUG oslo_concurrency.lockutils [None req-26110984-1e0b-4926-998e-dd0cf6cc9d92 tempest-ServersTestJSON-81219247 tempest-ServersTestJSON-81219247-project-member] Lock "b8aa46e2-728a-4ff9-bc4f-5cbd0a9e43e1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 56.459s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 705.490030] env[67893]: DEBUG nova.compute.manager [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Starting instance...
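[editor's note] The Acquiring/acquired/released "compute_resources" records that follow are oslo.concurrency's lockutils.synchronized decorator logging lock wait and hold times around the resource tracker's claim bookkeeping. A minimal sketch; the function body and return value are placeholders:

from oslo_concurrency import lockutils


@lockutils.synchronized('compute_resources')
def instance_claim(instance):
    # Under the named semaphore, check free resources and record the
    # claim; lockutils itself logs the waited/held durations seen in
    # the surrounding records.
    return {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}


print(instance_claim(instance=None))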
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 705.552535] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 705.552793] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 705.554298] env[67893]: INFO nova.compute.claims [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 706.024214] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-948a552b-8cc0-4918-a21e-b5aae30cea7c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 706.032395] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59da7094-6d1e-411f-9bd2-bcd097a2fe38 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 706.063720] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4370e1d-bd44-438d-8518-6b4e53d55f4c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 706.071390] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b87abe6-1ea0-4fc2-a931-74c9cddbec6f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 706.085510] env[67893]: DEBUG nova.compute.provider_tree [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 706.101815] env[67893]: DEBUG nova.scheduler.client.report [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 706.121795] env[67893]: DEBUG oslo_concurrency.lockutils [None 
req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.569s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 706.122339] env[67893]: DEBUG nova.compute.manager [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 706.159417] env[67893]: DEBUG nova.compute.utils [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 706.160673] env[67893]: DEBUG nova.compute.manager [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 706.160842] env[67893]: DEBUG nova.network.neutron [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 706.173540] env[67893]: DEBUG nova.compute.manager [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 706.242727] env[67893]: DEBUG nova.compute.manager [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Start spawning the instance on the hypervisor. 
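[editor's note] The nova.virt.hardware records that follow enumerate CPU topologies for the m1.nano flavor (1 vCPU) under effectively unbounded limits, ending with the single candidate 1:1:1. A brute-force illustration of that search, not Nova's implementation:

from dataclasses import dataclass


@dataclass(frozen=True)
class VirtCPUTopology:
    sockets: int
    cores: int
    threads: int


def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    # Keep every (sockets, cores, threads) split whose product is vcpus.
    return [VirtCPUTopology(s, c, t)
            for s in range(1, min(vcpus, max_sockets) + 1)
            for c in range(1, min(vcpus, max_cores) + 1)
            for t in range(1, min(vcpus, max_threads) + 1)
            if s * c * t == vcpus]


# For vcpus=1 the only candidate is sockets=1, cores=1, threads=1.
print(possible_topologies(1))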
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 706.278211] env[67893]: DEBUG nova.virt.hardware [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 706.278496] env[67893]: DEBUG nova.virt.hardware [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 706.278649] env[67893]: DEBUG nova.virt.hardware [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 706.278834] env[67893]: DEBUG nova.virt.hardware [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 706.279017] env[67893]: DEBUG nova.virt.hardware [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 706.279170] env[67893]: DEBUG nova.virt.hardware [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 706.279397] env[67893]: DEBUG nova.virt.hardware [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 706.279984] env[67893]: DEBUG nova.virt.hardware [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 706.279984] env[67893]: DEBUG 
nova.virt.hardware [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 706.279984] env[67893]: DEBUG nova.virt.hardware [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 706.280195] env[67893]: DEBUG nova.virt.hardware [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 706.281868] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb8e433c-9958-4e92-8407-1ff85ecd326b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 706.290018] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcd1b4bc-98c6-438a-9e96-955732ba0e43 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 706.502139] env[67893]: DEBUG nova.policy [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '81831258a1384f69940a78a5c273b0e0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '99680cfc5d84408c8a137f1e304196ff', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 707.237900] env[67893]: DEBUG nova.network.neutron [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Successfully created port: bcc8432f-d448-4c8b-9785-adcb1e386725 {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 708.093453] env[67893]: DEBUG nova.compute.manager [req-99542e8a-0243-433c-90e8-d91f7b27a012 req-c1038ea3-a98e-41bc-a885-c5b3759403ce service nova] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Received event network-vif-plugged-bcc8432f-d448-4c8b-9785-adcb1e386725 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 708.093753] env[67893]: DEBUG oslo_concurrency.lockutils [req-99542e8a-0243-433c-90e8-d91f7b27a012 req-c1038ea3-a98e-41bc-a885-c5b3759403ce service nova] Acquiring lock "19ab9782-9131-46ba-bbf2-cc021953046e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 708.093882] env[67893]: DEBUG oslo_concurrency.lockutils [req-99542e8a-0243-433c-90e8-d91f7b27a012 
req-c1038ea3-a98e-41bc-a885-c5b3759403ce service nova] Lock "19ab9782-9131-46ba-bbf2-cc021953046e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 708.094057] env[67893]: DEBUG oslo_concurrency.lockutils [req-99542e8a-0243-433c-90e8-d91f7b27a012 req-c1038ea3-a98e-41bc-a885-c5b3759403ce service nova] Lock "19ab9782-9131-46ba-bbf2-cc021953046e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 708.094228] env[67893]: DEBUG nova.compute.manager [req-99542e8a-0243-433c-90e8-d91f7b27a012 req-c1038ea3-a98e-41bc-a885-c5b3759403ce service nova] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] No waiting events found dispatching network-vif-plugged-bcc8432f-d448-4c8b-9785-adcb1e386725 {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 708.094377] env[67893]: WARNING nova.compute.manager [req-99542e8a-0243-433c-90e8-d91f7b27a012 req-c1038ea3-a98e-41bc-a885-c5b3759403ce service nova] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Received unexpected event network-vif-plugged-bcc8432f-d448-4c8b-9785-adcb1e386725 for instance with vm_state building and task_state spawning. [ 708.270981] env[67893]: DEBUG nova.network.neutron [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Successfully updated port: bcc8432f-d448-4c8b-9785-adcb1e386725 {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 708.282529] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Acquiring lock "refresh_cache-19ab9782-9131-46ba-bbf2-cc021953046e" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 708.282763] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Acquired lock "refresh_cache-19ab9782-9131-46ba-bbf2-cc021953046e" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 708.282840] env[67893]: DEBUG nova.network.neutron [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 708.360941] env[67893]: DEBUG nova.network.neutron [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Instance cache missing network info. 
{{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 708.673243] env[67893]: DEBUG nova.network.neutron [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Updating instance_info_cache with network_info: [{"id": "bcc8432f-d448-4c8b-9785-adcb1e386725", "address": "fa:16:3e:50:23:5b", "network": {"id": "0158027f-656f-43e3-aebf-7cbb75cfd948", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.253", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "304a5519bb7c46efb34a42749d9cf409", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a8b99a46-3e7f-4ef1-9e45-58e6cd17f210", "external-id": "nsx-vlan-transportzone-704", "segmentation_id": 704, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbcc8432f-d4", "ovs_interfaceid": "bcc8432f-d448-4c8b-9785-adcb1e386725", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 708.684251] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Releasing lock "refresh_cache-19ab9782-9131-46ba-bbf2-cc021953046e" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 708.685031] env[67893]: DEBUG nova.compute.manager [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Instance network_info: |[{"id": "bcc8432f-d448-4c8b-9785-adcb1e386725", "address": "fa:16:3e:50:23:5b", "network": {"id": "0158027f-656f-43e3-aebf-7cbb75cfd948", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.253", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "304a5519bb7c46efb34a42749d9cf409", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a8b99a46-3e7f-4ef1-9e45-58e6cd17f210", "external-id": "nsx-vlan-transportzone-704", "segmentation_id": 704, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbcc8432f-d4", "ovs_interfaceid": "bcc8432f-d448-4c8b-9785-adcb1e386725", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 708.685160] env[67893]: DEBUG 
nova.virt.vmwareapi.vmops [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:50:23:5b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a8b99a46-3e7f-4ef1-9e45-58e6cd17f210', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'bcc8432f-d448-4c8b-9785-adcb1e386725', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 708.692444] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Creating folder: Project (99680cfc5d84408c8a137f1e304196ff). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 708.693019] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f547e316-7045-4fc4-aaad-64c49ff6fbb1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 708.703016] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Created folder: Project (99680cfc5d84408c8a137f1e304196ff) in parent group-v689771. [ 708.703261] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Creating folder: Instances. Parent ref: group-v689809. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 708.703499] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-95c7b4d9-4da4-4a9d-ab23-7891b39084b9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 708.711983] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Created folder: Instances in parent group-v689809. [ 708.712226] env[67893]: DEBUG oslo.service.loopingcall [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 708.712412] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 708.712635] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cc78b551-10f5-4ded-b42c-3dff8f9cc4f2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 708.735018] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 708.735018] env[67893]: value = "task-3455339" [ 708.735018] env[67893]: _type = "Task" [ 708.735018] env[67893]: } to complete. 
{{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 708.742059] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455339, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 709.243139] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455339, 'name': CreateVM_Task, 'duration_secs': 0.360219} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 709.243421] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 709.243974] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 709.244174] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 709.244453] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 709.244700] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0f336033-2c8e-4170-8b7a-ab11cb7f6705 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 709.249174] env[67893]: DEBUG oslo_vmware.api [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Waiting for the task: (returnval){ [ 709.249174] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]5264278f-1edf-011e-d84b-b4f8634114d3" [ 709.249174] env[67893]: _type = "Task" [ 709.249174] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 709.257231] env[67893]: DEBUG oslo_vmware.api [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]5264278f-1edf-011e-d84b-b4f8634114d3, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 709.759517] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 709.759517] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 709.759739] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 710.528795] env[67893]: DEBUG nova.compute.manager [req-a7f4380b-9e59-4407-be0a-72c6763d5a10 req-b5779821-1e10-4e1f-a19b-668a657f13f6 service nova] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Received event network-changed-bcc8432f-d448-4c8b-9785-adcb1e386725 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 710.529151] env[67893]: DEBUG nova.compute.manager [req-a7f4380b-9e59-4407-be0a-72c6763d5a10 req-b5779821-1e10-4e1f-a19b-668a657f13f6 service nova] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Refreshing instance network info cache due to event network-changed-bcc8432f-d448-4c8b-9785-adcb1e386725. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 710.529240] env[67893]: DEBUG oslo_concurrency.lockutils [req-a7f4380b-9e59-4407-be0a-72c6763d5a10 req-b5779821-1e10-4e1f-a19b-668a657f13f6 service nova] Acquiring lock "refresh_cache-19ab9782-9131-46ba-bbf2-cc021953046e" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 710.529392] env[67893]: DEBUG oslo_concurrency.lockutils [req-a7f4380b-9e59-4407-be0a-72c6763d5a10 req-b5779821-1e10-4e1f-a19b-668a657f13f6 service nova] Acquired lock "refresh_cache-19ab9782-9131-46ba-bbf2-cc021953046e" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 710.529557] env[67893]: DEBUG nova.network.neutron [req-a7f4380b-9e59-4407-be0a-72c6763d5a10 req-b5779821-1e10-4e1f-a19b-668a657f13f6 service nova] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Refreshing network info cache for port bcc8432f-d448-4c8b-9785-adcb1e386725 {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 710.884793] env[67893]: DEBUG nova.network.neutron [req-a7f4380b-9e59-4407-be0a-72c6763d5a10 req-b5779821-1e10-4e1f-a19b-668a657f13f6 service nova] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Updated VIF entry in instance network info cache for port bcc8432f-d448-4c8b-9785-adcb1e386725. 
{{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 710.887038] env[67893]: DEBUG nova.network.neutron [req-a7f4380b-9e59-4407-be0a-72c6763d5a10 req-b5779821-1e10-4e1f-a19b-668a657f13f6 service nova] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Updating instance_info_cache with network_info: [{"id": "bcc8432f-d448-4c8b-9785-adcb1e386725", "address": "fa:16:3e:50:23:5b", "network": {"id": "0158027f-656f-43e3-aebf-7cbb75cfd948", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.253", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "304a5519bb7c46efb34a42749d9cf409", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a8b99a46-3e7f-4ef1-9e45-58e6cd17f210", "external-id": "nsx-vlan-transportzone-704", "segmentation_id": 704, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbcc8432f-d4", "ovs_interfaceid": "bcc8432f-d448-4c8b-9785-adcb1e386725", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 710.894627] env[67893]: DEBUG oslo_concurrency.lockutils [req-a7f4380b-9e59-4407-be0a-72c6763d5a10 req-b5779821-1e10-4e1f-a19b-668a657f13f6 service nova] Releasing lock "refresh_cache-19ab9782-9131-46ba-bbf2-cc021953046e" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 713.939981] env[67893]: DEBUG oslo_concurrency.lockutils [None req-74732dd9-b3b5-4eab-800b-ffe44686edab tempest-InstanceActionsTestJSON-1015446130 tempest-InstanceActionsTestJSON-1015446130-project-member] Acquiring lock "cb76498a-b404-40f3-ac3f-93aea525abee" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 713.940620] env[67893]: DEBUG oslo_concurrency.lockutils [None req-74732dd9-b3b5-4eab-800b-ffe44686edab tempest-InstanceActionsTestJSON-1015446130 tempest-InstanceActionsTestJSON-1015446130-project-member] Lock "cb76498a-b404-40f3-ac3f-93aea525abee" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 729.599758] env[67893]: WARNING oslo_vmware.rw_handles [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 729.599758] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 729.599758] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 729.599758] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 729.599758] env[67893]: 
ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 729.599758] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 729.599758] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 729.599758] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 729.599758] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 729.599758] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 729.599758] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 729.599758] env[67893]: ERROR oslo_vmware.rw_handles [ 729.600298] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/1c0c9f18-6b29-4565-a8d4-c8c6dfc6095b/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 729.601811] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 729.602083] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Copying Virtual Disk [datastore1] vmware_temp/1c0c9f18-6b29-4565-a8d4-c8c6dfc6095b/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/1c0c9f18-6b29-4565-a8d4-c8c6dfc6095b/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 729.602388] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-758932c9-85a1-4e56-b83a-58446d511bed {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 729.612052] env[67893]: DEBUG oslo_vmware.api [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Waiting for the task: (returnval){ [ 729.612052] env[67893]: value = "task-3455340" [ 729.612052] env[67893]: _type = "Task" [ 729.612052] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 729.619767] env[67893]: DEBUG oslo_vmware.api [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Task: {'id': task-3455340, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 730.122743] env[67893]: DEBUG oslo_vmware.exceptions [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 730.123027] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 730.123606] env[67893]: ERROR nova.compute.manager [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 730.123606] env[67893]: Faults: ['InvalidArgument'] [ 730.123606] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] Traceback (most recent call last): [ 730.123606] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 730.123606] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] yield resources [ 730.123606] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 730.123606] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] self.driver.spawn(context, instance, image_meta, [ 730.123606] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 730.123606] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] self._vmops.spawn(context, instance, image_meta, injected_files, [ 730.123606] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 730.123606] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] self._fetch_image_if_missing(context, vi) [ 730.123606] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 730.123875] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] image_cache(vi, tmp_image_ds_loc) [ 730.123875] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 730.123875] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] vm_util.copy_virtual_disk( [ 730.123875] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 730.123875] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] session._wait_for_task(vmdk_copy_task) [ 730.123875] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 730.123875] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] return self.wait_for_task(task_ref) [ 730.123875] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 730.123875] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] return evt.wait() [ 730.123875] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 730.123875] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] result = hub.switch() [ 730.123875] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 730.123875] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] return self.greenlet.switch() [ 730.124339] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 730.124339] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] self.f(*self.args, **self.kw) [ 730.124339] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 730.124339] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] raise exceptions.translate_fault(task_info.error) [ 730.124339] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 730.124339] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] Faults: ['InvalidArgument'] [ 730.124339] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] [ 730.124339] env[67893]: INFO nova.compute.manager [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Terminating instance [ 730.125817] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 730.125817] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 
tempest-DeleteServersAdminTestJSON-2022472478-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 730.126096] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-430a8a80-2f9c-48e1-a6bd-048ec04f5071 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.128446] env[67893]: DEBUG nova.compute.manager [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 730.128620] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 730.129390] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bfc9f51-223a-4248-a592-dbe517cd4263 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.135956] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 730.136199] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-cc09c720-57fa-4757-a33c-4193d4baa162 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.138459] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 730.138643] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 730.139627] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7e19539d-f048-4804-960e-8cdbf4f4e359 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.144175] env[67893]: DEBUG oslo_vmware.api [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Waiting for the task: (returnval){ [ 730.144175] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52a04003-bfca-04c7-3634-72bd94140354" [ 730.144175] env[67893]: _type = "Task" [ 730.144175] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 730.151395] env[67893]: DEBUG oslo_vmware.api [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52a04003-bfca-04c7-3634-72bd94140354, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 730.210483] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 730.210746] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 730.210941] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Deleting the datastore file [datastore1] 30d52736-4195-4767-89e0-8572dc96de29 {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 730.211214] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3f299fee-7938-49c3-857d-379ed54021d6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.217727] env[67893]: DEBUG oslo_vmware.api [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Waiting for the task: (returnval){ [ 730.217727] env[67893]: value = "task-3455342" [ 730.217727] env[67893]: _type = "Task" [ 730.217727] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 730.225328] env[67893]: DEBUG oslo_vmware.api [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Task: {'id': task-3455342, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 730.655026] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 730.655307] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Creating directory with path [datastore1] vmware_temp/ead30d55-41eb-43b5-acaa-aa57beff1aa9/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 730.655574] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e22d42ab-15a2-4d3c-9174-6aacf00d9b08 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.668062] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Created directory with path [datastore1] vmware_temp/ead30d55-41eb-43b5-acaa-aa57beff1aa9/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 730.668062] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Fetch image to [datastore1] vmware_temp/ead30d55-41eb-43b5-acaa-aa57beff1aa9/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 730.668062] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/ead30d55-41eb-43b5-acaa-aa57beff1aa9/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 730.668674] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-caab8346-0bb7-4b1f-bee1-dd18675a0fe0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.675287] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53bfcb7e-ef54-4d33-80d6-bed9182c5625 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.684528] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c56f0deb-270e-4878-a7b2-4ec6daff767c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.714127] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-379a9d64-8881-4f1b-97d6-57e86b7dd3ac {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.722433] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-747eb61c-7980-408c-bd29-b9cb5d254df4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.729966] env[67893]: DEBUG oslo_vmware.api [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Task: {'id': task-3455342, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07837} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 730.730697] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 730.730697] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 730.730697] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 730.730697] env[67893]: INFO nova.compute.manager [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Took 0.60 seconds to destroy the instance on the hypervisor. 
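The task lifecycle that repeats through this trace (CreateVM_Task, SearchDatastore_Task, and the DeleteDatastoreFile_Task just above: "Waiting for the task ... to complete", "progress is 0%", then "completed successfully" with a duration_secs) is oslo.vmware's wait_for_task/_poll_task loop: invoke a vSphere *_Task method, then poll the TaskInfo until it reaches success or error. A minimal sketch of the same pattern, assuming a live oslo.vmware session; the endpoint, credentials, and datastore path below are placeholders, not values from this log:

from oslo_vmware import api

# Hypothetical session; VMwareAPISession connects eagerly by default
# (compare the _create_session lock records at service startup).
session = api.VMwareAPISession(
    'vcenter.example.test', 'svc-user', 'secret',
    api_retry_count=3, task_poll_interval=0.5)

# Start a server-side task, then block on the poll loop. wait_for_task()
# returns the TaskInfo on 'success' and raises an oslo_vmware exception
# (via exceptions.translate_fault) on 'error', which is the path the
# CopyVirtualDisk_Task failure in this trace takes.
file_manager = session.vim.service_content.fileManager
task = session.invoke_api(
    session.vim, 'DeleteDatastoreFile_Task', file_manager,
    name='[datastore1] placeholder-instance-dir',  # placeholder path
    datacenter=None)  # simplification; real callers pass a datacenter ref
task_info = session.wait_for_task(task)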
[ 730.734826] env[67893]: DEBUG nova.compute.claims [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 730.734998] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 730.735250] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 730.744726] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 730.806519] env[67893]: DEBUG oslo_vmware.rw_handles [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ead30d55-41eb-43b5-acaa-aa57beff1aa9/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 730.867156] env[67893]: DEBUG oslo_vmware.rw_handles [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 730.867402] env[67893]: DEBUG oslo_vmware.rw_handles [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ead30d55-41eb-43b5-acaa-aa57beff1aa9/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 731.164445] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c46dc52-53cf-477e-8c24-1da47bc269bc {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.171947] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c905face-4a94-4f40-9f20-52bb7142deed {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.200560] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a6f6712-e3f3-47c7-993f-59948a4227c8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.207349] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5ab85c1-3ded-426a-904d-b8bac4e3e05c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.221022] env[67893]: DEBUG nova.compute.provider_tree [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 731.228765] env[67893]: DEBUG nova.scheduler.client.report [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 731.245208] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.510s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 731.245737] env[67893]: ERROR nova.compute.manager [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 731.245737] env[67893]: Faults: ['InvalidArgument'] [ 731.245737] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] Traceback (most recent call last): [ 731.245737] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 731.245737] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] self.driver.spawn(context, instance, image_meta, [ 731.245737] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 731.245737] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] self._vmops.spawn(context, instance, image_meta, injected_files, [ 731.245737] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 731.245737] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] self._fetch_image_if_missing(context, vi) [ 731.245737] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 731.245737] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] image_cache(vi, tmp_image_ds_loc) [ 731.245737] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 731.246253] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] vm_util.copy_virtual_disk( [ 731.246253] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 731.246253] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] session._wait_for_task(vmdk_copy_task) [ 731.246253] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 731.246253] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] return self.wait_for_task(task_ref) [ 731.246253] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 731.246253] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] return evt.wait() [ 731.246253] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 731.246253] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] result = hub.switch() [ 731.246253] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 731.246253] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] return self.greenlet.switch() [ 731.246253] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 731.246253] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] self.f(*self.args, **self.kw) [ 731.246545] env[67893]: ERROR nova.compute.manager [instance: 
30d52736-4195-4767-89e0-8572dc96de29] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 731.246545] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] raise exceptions.translate_fault(task_info.error) [ 731.246545] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 731.246545] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] Faults: ['InvalidArgument'] [ 731.246545] env[67893]: ERROR nova.compute.manager [instance: 30d52736-4195-4767-89e0-8572dc96de29] [ 731.246545] env[67893]: DEBUG nova.compute.utils [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 731.248125] env[67893]: DEBUG nova.compute.manager [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Build of instance 30d52736-4195-4767-89e0-8572dc96de29 was re-scheduled: A specified parameter was not correct: fileType [ 731.248125] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 731.248533] env[67893]: DEBUG nova.compute.manager [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 731.248721] env[67893]: DEBUG nova.compute.manager [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
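
The traceback above shows oslo.vmware translating a failed vCenter task into a VimFaultException, which Nova catches to trigger the reschedule. Below is a minimal sketch of that surface, not Nova's actual code path: it assumes a reachable vCenter, and the session setup, copy_disk, src_path, dst_path, and dc_ref are illustrative placeholders.

    from oslo_vmware import api
    from oslo_vmware import exceptions

    # Illustrative session; host and credentials are placeholders.
    session = api.VMwareAPISession('vc.example.test', 'user', 'secret',
                                   10,   # api_retry_count
                                   0.5)  # task_poll_interval

    def copy_disk(src_path, dst_path, dc_ref):
        # dc_ref: a Datacenter managed-object reference obtained elsewhere.
        task = session.invoke_api(session.vim, 'CopyVirtualDisk_Task',
                                  session.vim.service_content.virtualDiskManager,
                                  sourceName=src_path, sourceDatacenter=dc_ref,
                                  destName=dst_path)
        try:
            session.wait_for_task(task)  # polls until success or fault
        except exceptions.VimFaultException as e:
            # e.fault_list carries the raw VMware fault names, e.g.
            # ['InvalidArgument'], matching the "Faults:" lines above.
            print(e.fault_list, str(e))
            raise
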
{{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 731.248869] env[67893]: DEBUG nova.compute.manager [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 731.249054] env[67893]: DEBUG nova.network.neutron [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 731.604016] env[67893]: DEBUG nova.network.neutron [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 731.614629] env[67893]: INFO nova.compute.manager [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] [instance: 30d52736-4195-4767-89e0-8572dc96de29] Took 0.37 seconds to deallocate network for instance. [ 731.714218] env[67893]: INFO nova.scheduler.client.report [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Deleted allocations for instance 30d52736-4195-4767-89e0-8572dc96de29 [ 731.736462] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f79284d8-a1ca-4825-b54a-a3ca50985cdd tempest-FloatingIPsAssociationTestJSON-1635449596 tempest-FloatingIPsAssociationTestJSON-1635449596-project-member] Lock "30d52736-4195-4767-89e0-8572dc96de29" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 104.419s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 731.761975] env[67893]: DEBUG nova.compute.manager [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Starting instance... 
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 731.829281] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 731.829534] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 731.831022] env[67893]: INFO nova.compute.claims [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 732.230053] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b913ddf-28e1-434c-938a-f808f7aa9000 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.237618] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19e318c4-2913-4259-9385-d1423778874a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.267056] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f01f9e5-d075-4eb6-a69a-4a4c721f847b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.273832] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc46085a-dbb9-42e5-91f6-3bd34827017b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.287074] env[67893]: DEBUG nova.compute.provider_tree [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 732.297708] env[67893]: DEBUG nova.scheduler.client.report [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 732.311690] env[67893]: DEBUG oslo_concurrency.lockutils [None 
req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.482s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 732.312185] env[67893]: DEBUG nova.compute.manager [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 732.344972] env[67893]: DEBUG nova.compute.utils [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 732.346568] env[67893]: DEBUG nova.compute.manager [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 732.346750] env[67893]: DEBUG nova.network.neutron [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 732.354998] env[67893]: DEBUG nova.compute.manager [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 732.420342] env[67893]: DEBUG nova.compute.manager [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Start spawning the instance on the hypervisor. 
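
The repeated Acquiring/acquired/released "compute_resources" lines come from oslo.concurrency's named-lock decorator, which serializes callers on a shared semaphore and logs wait and hold times at DEBUG. A minimal sketch of the pattern (the function name is illustrative):

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def instance_claim():
        # Body runs with the "compute_resources" lock held; entry and exit
        # emit the acquired/released DEBUG lines seen throughout this log.
        pass

    instance_claim()
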
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 732.423667] env[67893]: DEBUG nova.policy [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '81831258a1384f69940a78a5c273b0e0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '99680cfc5d84408c8a137f1e304196ff', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 732.545572] env[67893]: DEBUG nova.virt.hardware [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 732.545700] env[67893]: DEBUG nova.virt.hardware [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 732.545862] env[67893]: DEBUG nova.virt.hardware [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 732.546018] env[67893]: DEBUG nova.virt.hardware [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 732.546597] env[67893]: DEBUG nova.virt.hardware [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 732.546597] env[67893]: DEBUG nova.virt.hardware [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 732.546805] env[67893]: DEBUG 
nova.virt.hardware [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 732.547025] env[67893]: DEBUG nova.virt.hardware [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 732.547408] env[67893]: DEBUG nova.virt.hardware [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 732.547478] env[67893]: DEBUG nova.virt.hardware [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 732.547811] env[67893]: DEBUG nova.virt.hardware [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 732.548568] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcfeb6b1-d28b-4752-a48c-8f828eb3dd02 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.557575] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-460c8b53-ce1e-4a9b-b386-008d2c30a782 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.833125] env[67893]: DEBUG nova.network.neutron [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Successfully created port: 9d1e7816-6aa0-4f0e-905f-09284dd52e1a {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 733.835014] env[67893]: DEBUG nova.network.neutron [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Successfully updated port: 9d1e7816-6aa0-4f0e-905f-09284dd52e1a {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 733.854291] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Acquiring lock "refresh_cache-2eb8d698-9436-4e91-bd10-5f5200415144" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 733.854455] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 
tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Acquired lock "refresh_cache-2eb8d698-9436-4e91-bd10-5f5200415144" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 733.854606] env[67893]: DEBUG nova.network.neutron [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 733.901096] env[67893]: DEBUG nova.compute.manager [req-244d0dae-4275-4fcf-8c19-5430f859539a req-5477ff1e-cb9f-4e19-a25b-c1c75f660c2c service nova] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Received event network-vif-plugged-9d1e7816-6aa0-4f0e-905f-09284dd52e1a {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 733.901912] env[67893]: DEBUG oslo_concurrency.lockutils [req-244d0dae-4275-4fcf-8c19-5430f859539a req-5477ff1e-cb9f-4e19-a25b-c1c75f660c2c service nova] Acquiring lock "2eb8d698-9436-4e91-bd10-5f5200415144-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 733.903077] env[67893]: DEBUG oslo_concurrency.lockutils [req-244d0dae-4275-4fcf-8c19-5430f859539a req-5477ff1e-cb9f-4e19-a25b-c1c75f660c2c service nova] Lock "2eb8d698-9436-4e91-bd10-5f5200415144-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 733.904027] env[67893]: DEBUG oslo_concurrency.lockutils [req-244d0dae-4275-4fcf-8c19-5430f859539a req-5477ff1e-cb9f-4e19-a25b-c1c75f660c2c service nova] Lock "2eb8d698-9436-4e91-bd10-5f5200415144-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 733.904027] env[67893]: DEBUG nova.compute.manager [req-244d0dae-4275-4fcf-8c19-5430f859539a req-5477ff1e-cb9f-4e19-a25b-c1c75f660c2c service nova] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] No waiting events found dispatching network-vif-plugged-9d1e7816-6aa0-4f0e-905f-09284dd52e1a {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 733.904651] env[67893]: WARNING nova.compute.manager [req-244d0dae-4275-4fcf-8c19-5430f859539a req-5477ff1e-cb9f-4e19-a25b-c1c75f660c2c service nova] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Received unexpected event network-vif-plugged-9d1e7816-6aa0-4f0e-905f-09284dd52e1a for instance with vm_state building and task_state spawning. [ 733.908681] env[67893]: DEBUG nova.network.neutron [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Instance cache missing network info. 
{{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 734.160746] env[67893]: DEBUG nova.network.neutron [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Updating instance_info_cache with network_info: [{"id": "9d1e7816-6aa0-4f0e-905f-09284dd52e1a", "address": "fa:16:3e:e4:36:f4", "network": {"id": "0158027f-656f-43e3-aebf-7cbb75cfd948", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.72", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "304a5519bb7c46efb34a42749d9cf409", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a8b99a46-3e7f-4ef1-9e45-58e6cd17f210", "external-id": "nsx-vlan-transportzone-704", "segmentation_id": 704, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9d1e7816-6a", "ovs_interfaceid": "9d1e7816-6aa0-4f0e-905f-09284dd52e1a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 734.174339] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Releasing lock "refresh_cache-2eb8d698-9436-4e91-bd10-5f5200415144" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 734.175788] env[67893]: DEBUG nova.compute.manager [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Instance network_info: |[{"id": "9d1e7816-6aa0-4f0e-905f-09284dd52e1a", "address": "fa:16:3e:e4:36:f4", "network": {"id": "0158027f-656f-43e3-aebf-7cbb75cfd948", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.72", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "304a5519bb7c46efb34a42749d9cf409", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a8b99a46-3e7f-4ef1-9e45-58e6cd17f210", "external-id": "nsx-vlan-transportzone-704", "segmentation_id": 704, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9d1e7816-6a", "ovs_interfaceid": "9d1e7816-6aa0-4f0e-905f-09284dd52e1a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 734.178075] env[67893]: DEBUG 
nova.virt.vmwareapi.vmops [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e4:36:f4', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a8b99a46-3e7f-4ef1-9e45-58e6cd17f210', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '9d1e7816-6aa0-4f0e-905f-09284dd52e1a', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 734.184758] env[67893]: DEBUG oslo.service.loopingcall [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 734.185379] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 734.185817] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6de49810-f844-4622-a3b5-1c52007f3e4a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 734.210400] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 734.210400] env[67893]: value = "task-3455343" [ 734.210400] env[67893]: _type = "Task" [ 734.210400] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 734.220142] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455343, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 734.720742] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455343, 'name': CreateVM_Task, 'duration_secs': 0.302483} completed successfully. 
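
The CreateVM_Task lines above and below show the task-polling loop: oslo.vmware starts a fixed-interval looping call that reads the task state until it reaches success or error. A rough self-contained sketch of that loop using oslo.service's loopingcall; get_task_info and its stub result are invented stand-ins for the real PropertyCollector read.

    from oslo_service import loopingcall

    def get_task_info():
        class Info:  # stub so the sketch runs without a vCenter
            state, error = 'success', None
        return Info()

    def _poll_task():
        info = get_task_info()
        if info.state == 'success':
            raise loopingcall.LoopingCallDone(info)  # stop loop, return info
        elif info.state == 'error':
            raise RuntimeError(info.error)  # surfaces as the task fault

    timer = loopingcall.FixedIntervalLoopingCall(_poll_task)
    result = timer.start(interval=0.5).wait()  # blocks until done or error
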
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 734.721357] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 734.722074] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 734.723530] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 734.723530] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 734.723530] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-22d1fca8-dd19-4222-a88d-e409143a5f0a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 734.727617] env[67893]: DEBUG oslo_vmware.api [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Waiting for the task: (returnval){ [ 734.727617] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52626459-7aba-2881-a11b-295ff8e5ce76" [ 734.727617] env[67893]: _type = "Task" [ 734.727617] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 734.736719] env[67893]: DEBUG oslo_vmware.api [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52626459-7aba-2881-a11b-295ff8e5ce76, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 735.239606] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 735.239924] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 735.241686] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 735.976263] env[67893]: DEBUG nova.compute.manager [req-ed4eea71-d49d-415e-b0ad-d50647cdbba3 req-4e3379bb-9fcd-4adb-98c9-b6b842383406 service nova] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Received event network-changed-9d1e7816-6aa0-4f0e-905f-09284dd52e1a {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 735.976263] env[67893]: DEBUG nova.compute.manager [req-ed4eea71-d49d-415e-b0ad-d50647cdbba3 req-4e3379bb-9fcd-4adb-98c9-b6b842383406 service nova] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Refreshing instance network info cache due to event network-changed-9d1e7816-6aa0-4f0e-905f-09284dd52e1a. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 735.976263] env[67893]: DEBUG oslo_concurrency.lockutils [req-ed4eea71-d49d-415e-b0ad-d50647cdbba3 req-4e3379bb-9fcd-4adb-98c9-b6b842383406 service nova] Acquiring lock "refresh_cache-2eb8d698-9436-4e91-bd10-5f5200415144" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 735.976263] env[67893]: DEBUG oslo_concurrency.lockutils [req-ed4eea71-d49d-415e-b0ad-d50647cdbba3 req-4e3379bb-9fcd-4adb-98c9-b6b842383406 service nova] Acquired lock "refresh_cache-2eb8d698-9436-4e91-bd10-5f5200415144" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 735.976263] env[67893]: DEBUG nova.network.neutron [req-ed4eea71-d49d-415e-b0ad-d50647cdbba3 req-4e3379bb-9fcd-4adb-98c9-b6b842383406 service nova] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Refreshing network info cache for port 9d1e7816-6aa0-4f0e-905f-09284dd52e1a {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 736.333439] env[67893]: DEBUG nova.network.neutron [req-ed4eea71-d49d-415e-b0ad-d50647cdbba3 req-4e3379bb-9fcd-4adb-98c9-b6b842383406 service nova] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Updated VIF entry in instance network info cache for port 9d1e7816-6aa0-4f0e-905f-09284dd52e1a. 
{{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 736.333782] env[67893]: DEBUG nova.network.neutron [req-ed4eea71-d49d-415e-b0ad-d50647cdbba3 req-4e3379bb-9fcd-4adb-98c9-b6b842383406 service nova] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Updating instance_info_cache with network_info: [{"id": "9d1e7816-6aa0-4f0e-905f-09284dd52e1a", "address": "fa:16:3e:e4:36:f4", "network": {"id": "0158027f-656f-43e3-aebf-7cbb75cfd948", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.72", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "304a5519bb7c46efb34a42749d9cf409", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a8b99a46-3e7f-4ef1-9e45-58e6cd17f210", "external-id": "nsx-vlan-transportzone-704", "segmentation_id": 704, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9d1e7816-6a", "ovs_interfaceid": "9d1e7816-6aa0-4f0e-905f-09284dd52e1a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 736.351216] env[67893]: DEBUG oslo_concurrency.lockutils [req-ed4eea71-d49d-415e-b0ad-d50647cdbba3 req-4e3379bb-9fcd-4adb-98c9-b6b842383406 service nova] Releasing lock "refresh_cache-2eb8d698-9436-4e91-bd10-5f5200415144" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 739.331344] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Acquiring lock "fcae7119-6233-4a52-9e52-1147f2b10ddc" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 739.331718] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Lock "fcae7119-6233-4a52-9e52-1147f2b10ddc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.005s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 757.859918] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 757.860259] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 758.854716] env[67893]: DEBUG oslo_service.periodic_task [None 
req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 758.858254] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 758.858396] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 758.858551] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 759.858617] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 759.858617] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 759.858906] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 759.882322] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 759.882495] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 759.882623] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 759.882752] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 759.882877] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Skipping network cache update for instance because it is Building. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 759.883009] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 759.883140] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 759.883261] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 759.883378] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 759.883520] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 759.883613] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. 
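
The "Running periodic task ComputeManager...." lines here are driven by oslo.service's periodic_task machinery: methods decorated with @periodic_task run on a fixed spacing, and each dispatch is logged. A minimal sketch, assuming default oslo.config wiring; the class and method names are illustrative.

    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF

    class Manager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(CONF)

        @periodic_task.periodic_task(spacing=60)
        def _heal_instance_info_cache(self, context):
            # Invoked roughly every 60s; each dispatch produces a
            # "Running periodic task ..." DEBUG line like those above.
            pass

    Manager().run_periodic_tasks(context=None)
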
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 759.884135] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 759.884312] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 759.884472] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 759.897822] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 759.898048] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 759.898222] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 759.898394] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 759.900656] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef2cbe84-bff6-439c-8b55-fc1cd81406a7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 759.910216] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99fb6837-a6e7-4136-af8f-464aacab0738 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 759.925730] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a1d6af7-4cbf-4501-89df-49281938d17c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 759.932292] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-980acace-24c4-4e97-bf17-311a4e31351c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 759.963371] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] 
Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181001MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 759.963524] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 759.963714] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 760.046620] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance d9e47a83-7921-4cf6-ba99-fb705bc52e4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 760.047625] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 760.047625] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 760.047625] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 043c631c-bf15-4b4c-9a92-49ea51b6d405 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 760.047625] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 760.047810] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2f69fae8-d060-4156-8880-071f5ee1f969 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
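
The per-instance allocations listed here and the inventory reported at the end of this audit reconcile through placement's capacity formula: capacity per resource class is (total - reserved) * allocation_ratio. A worked check against the numbers in this log, where ten 1-vCPU/128MB claims are active and used_ram additionally counts the 512MB reserved:

    # Inventory as reported for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57.
    inv = {'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
           'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0}}
    for rc, v in inv.items():
        cap = (v['total'] - v['reserved']) * v['allocation_ratio']
        print(rc, cap)  # VCPU 192.0, MEMORY_MB 196078.0
    # Usage: 10 instances x 1 VCPU = 10 of 192 VCPU;
    # 512 MB reserved + 10 x 128 MB = 1792 MB, matching used_ram=1792MB.
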
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 760.047810] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2256af1c-4ff8-46b9-b568-c25ce8886e5f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 760.047810] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 6520080a-8bf1-4803-9099-87c3ba6e28e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 760.047810] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 19ab9782-9131-46ba-bbf2-cc021953046e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 760.047991] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2eb8d698-9436-4e91-bd10-5f5200415144 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 760.058186] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.068520] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.077486] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a2a5e9bc-da8b-42df-9f5f-caf70d72cc0b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.086614] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2b03a2c3-33de-4fb4-b723-029652a7c780 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.095395] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 63254807-dead-415e-bdf6-e85780248d8f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.104445] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 194fe3b9-366e-4489-9b1f-2adf2a8ac6ee has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.113391] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance bac538d8-3dda-4851-8aa3-d60bae70b6ff has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.122175] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance d3019243-6b64-4d8f-87bb-ace791093969 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.131413] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance e7b5f3f7-5b2d-49c7-be13-3b481f1b3ca8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.139839] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 9be15a5a-2a28-412b-a893-387b8dd9a2c4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.148666] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 3d091507-3ab2-45da-a366-ff5d3f107134 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.157643] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 938d37dd-509b-4923-b192-3ce4a6d530c5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.166840] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 40c8659f-361a-4bf7-b16c-00bfc2c98729 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.177140] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a9656a7e-8a7b-489e-9990-097c1e93e535 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.187571] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance cb485828-0620-48fd-a9d4-a83e690f4675 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.199192] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance dd3d49f4-83c5-4a83-9674-fed5e190743c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.209626] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 9e3bbca4-2031-4a02-819c-2c9cf720eba9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.220302] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 88053de1-3cc2-4776-a56e-b34aa0c93764 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.229722] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance cb76498a-b404-40f3-ac3f-93aea525abee has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.240895] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance fcae7119-6233-4a52-9e52-1147f2b10ddc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.241016] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 760.241148] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 760.598048] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b3f7c4c-17a6-49b9-808a-65f66eb969f9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 760.606307] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-676aedca-a907-460f-8954-eccfed346e86 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 760.635789] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8a8f352-9742-4613-bbf0-9fa1e829cf70 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 760.642813] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84c8f442-e2eb-44dd-b8ec-094922bd8f0a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 760.656577] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 760.667527] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 
'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 760.685532] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 760.685750] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.722s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 777.736054] env[67893]: WARNING oslo_vmware.rw_handles [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 777.736054] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 777.736054] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 777.736054] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 777.736054] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 777.736054] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 777.736054] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 777.736054] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 777.736054] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 777.736054] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 777.736054] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 777.736054] env[67893]: ERROR oslo_vmware.rw_handles [ 777.736054] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/ead30d55-41eb-43b5-acaa-aa57beff1aa9/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 777.737663] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 777.737899] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c 
tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Copying Virtual Disk [datastore1] vmware_temp/ead30d55-41eb-43b5-acaa-aa57beff1aa9/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/ead30d55-41eb-43b5-acaa-aa57beff1aa9/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 777.738198] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-079de210-3afa-4976-852a-ab3efda281b2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 777.747307] env[67893]: DEBUG oslo_vmware.api [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Waiting for the task: (returnval){ [ 777.747307] env[67893]: value = "task-3455344" [ 777.747307] env[67893]: _type = "Task" [ 777.747307] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 777.755414] env[67893]: DEBUG oslo_vmware.api [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Task: {'id': task-3455344, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 778.257500] env[67893]: DEBUG oslo_vmware.exceptions [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Fault InvalidArgument not matched. 
{{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 778.257802] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 778.258373] env[67893]: ERROR nova.compute.manager [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 778.258373] env[67893]: Faults: ['InvalidArgument'] [ 778.258373] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Traceback (most recent call last): [ 778.258373] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 778.258373] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] yield resources [ 778.258373] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 778.258373] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] self.driver.spawn(context, instance, image_meta, [ 778.258373] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 778.258373] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 778.258373] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 778.258373] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] self._fetch_image_if_missing(context, vi) [ 778.258373] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 778.258730] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] image_cache(vi, tmp_image_ds_loc) [ 778.258730] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 778.258730] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] vm_util.copy_virtual_disk( [ 778.258730] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 778.258730] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] session._wait_for_task(vmdk_copy_task) [ 778.258730] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 778.258730] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] return self.wait_for_task(task_ref) [ 778.258730] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 778.258730] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] return evt.wait() [ 778.258730] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 778.258730] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] result = hub.switch() [ 778.258730] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 778.258730] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] return self.greenlet.switch() [ 778.259898] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 778.259898] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] self.f(*self.args, **self.kw) [ 778.259898] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 778.259898] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] raise exceptions.translate_fault(task_info.error) [ 778.259898] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 778.259898] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Faults: ['InvalidArgument'] [ 778.259898] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] [ 778.259898] env[67893]: INFO nova.compute.manager [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Terminating instance [ 778.260319] env[67893]: DEBUG oslo_concurrency.lockutils [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 778.260446] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 778.261061] env[67893]: DEBUG nova.compute.manager [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 
tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 778.261245] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 778.261466] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-58d6b1e4-c3d3-4c20-8206-a78c67a00be9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 778.263861] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fd2831b-a92d-4851-805d-a7722cf6a45a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 778.270100] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 778.270302] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a73650fb-992a-49d4-be85-1c9766d9bb7b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 778.273027] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 778.273027] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 778.273463] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2f2eb2a2-b089-4ef8-9628-6ebf44eef2b5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 778.278940] env[67893]: DEBUG oslo_vmware.api [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Waiting for the task: (returnval){ [ 778.278940] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]5228811c-939f-2225-ca94-f8f0b8e95ef9" [ 778.278940] env[67893]: _type = "Task" [ 778.278940] env[67893]: } to complete. 
{{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 778.285751] env[67893]: DEBUG oslo_vmware.api [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]5228811c-939f-2225-ca94-f8f0b8e95ef9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 778.347238] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 778.347448] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 778.347630] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Deleting the datastore file [datastore1] d9e47a83-7921-4cf6-ba99-fb705bc52e4a {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 778.347928] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-84d9a51d-2d9f-4f37-819d-df5aca9f327f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 778.354408] env[67893]: DEBUG oslo_vmware.api [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Waiting for the task: (returnval){ [ 778.354408] env[67893]: value = "task-3455346" [ 778.354408] env[67893]: _type = "Task" [ 778.354408] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 778.362078] env[67893]: DEBUG oslo_vmware.api [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Task: {'id': task-3455346, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 778.789964] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 778.790254] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Creating directory with path [datastore1] vmware_temp/972acfb5-544e-4dd9-ae79-cb9b0720bb56/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 778.790505] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-59092a5d-3342-4983-8c5d-0c62031a9406 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 778.811873] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Created directory with path [datastore1] vmware_temp/972acfb5-544e-4dd9-ae79-cb9b0720bb56/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 778.812100] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Fetch image to [datastore1] vmware_temp/972acfb5-544e-4dd9-ae79-cb9b0720bb56/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 778.812278] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/972acfb5-544e-4dd9-ae79-cb9b0720bb56/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 778.813039] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-386ee9a3-e5b2-4298-bfbe-ba2fab55bbb0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 778.820064] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1104511-87cf-4062-b1ab-ede208a2c0aa {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 778.828819] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c46fda0d-554d-426c-91c6-85fa27a827e5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 778.861857] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b317993-2d21-4c95-aef6-4eb96b47064e {{(pid=67893) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 778.868825] env[67893]: DEBUG oslo_vmware.api [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Task: {'id': task-3455346, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078947} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 778.870343] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 778.870542] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 778.870716] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 778.870900] env[67893]: INFO nova.compute.manager [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Took 0.61 seconds to destroy the instance on the hypervisor. 
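The failed spawn above follows oslo.vmware's task-polling pattern end to end: nova submits CopyVirtualDisk_Task (task-3455344), _poll_task checks the task's state until it reaches a terminal value, and a vSphere fault such as InvalidArgument ("A specified parameter was not correct: fileType") is translated into a VimFaultException, which the compute manager handles by tearing the half-built instance down again (DeleteDatastoreFile_Task, task-3455346, completing in 0.078947s above). The sketch below illustrates that polling loop in minimal, self-contained form; TaskInfo, get_task_info, and this wait_for_task are illustrative stand-ins under stated assumptions, not the oslo.vmware API.

import time
from dataclasses import dataclass

@dataclass
class TaskInfo:
    # Illustrative stand-in for a vSphere TaskInfo object (not the real type).
    state: str          # 'running' | 'success' | 'error'
    progress: int = 0   # the "progress is 0%" lines above come from polling this
    fault: str = ''     # e.g. 'InvalidArgument'
    message: str = ''   # e.g. 'A specified parameter was not correct: fileType'

class VimFaultException(Exception):
    # Mirrors the shape seen in the traceback: a message plus raw fault names.
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

def wait_for_task(get_task_info, interval=0.5):
    # Poll until the task reaches a terminal state, as _poll_task does above.
    # get_task_info is any callable returning a TaskInfo-like object.
    while True:
        info = get_task_info()
        if info.state == 'success':
            return info
        if info.state == 'error':
            # Analogue of raise exceptions.translate_fault(task_info.error):
            # preserve the raw fault name so callers can match on it,
            # the way "Fault InvalidArgument not matched" is logged above.
            raise VimFaultException([info.fault], info.message)
        time.sleep(interval)  # still running; poll again

In the log, the same loop drives both the copy task (which faults and aborts the build) and the cleanup's delete task (which succeeds), which is why each first reports "progress is 0%" and then either completes or raises.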
[ 778.872701] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d5e3becf-5c55-4414-9dfa-2240335688fa {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 778.875033] env[67893]: DEBUG nova.compute.claims [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 778.875209] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 778.875422] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 778.894746] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 778.960609] env[67893]: DEBUG oslo_vmware.rw_handles [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/972acfb5-544e-4dd9-ae79-cb9b0720bb56/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 779.024311] env[67893]: DEBUG oslo_vmware.rw_handles [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 779.024495] env[67893]: DEBUG oslo_vmware.rw_handles [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/972acfb5-544e-4dd9-ae79-cb9b0720bb56/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 779.337470] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-273c7391-9cec-4213-a4c4-f074b159aa2d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 779.346363] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7304e04-4d5f-4f5f-a1eb-7956831b8eac {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 779.375062] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-458ecc9f-b743-4fe9-977a-5c5447de6179 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 779.381922] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b991eca5-5a7a-4a3d-aaff-86053d571a72 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 779.395076] env[67893]: DEBUG nova.compute.provider_tree [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 779.403621] env[67893]: DEBUG nova.scheduler.client.report [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 779.417993] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.542s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 779.418605] env[67893]: ERROR nova.compute.manager [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 779.418605] env[67893]: Faults: ['InvalidArgument'] [ 779.418605] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Traceback (most recent call last): [ 779.418605] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 
779.418605] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] self.driver.spawn(context, instance, image_meta, [ 779.418605] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 779.418605] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 779.418605] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 779.418605] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] self._fetch_image_if_missing(context, vi) [ 779.418605] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 779.418605] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] image_cache(vi, tmp_image_ds_loc) [ 779.418605] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 779.419131] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] vm_util.copy_virtual_disk( [ 779.419131] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 779.419131] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] session._wait_for_task(vmdk_copy_task) [ 779.419131] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 779.419131] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] return self.wait_for_task(task_ref) [ 779.419131] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 779.419131] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] return evt.wait() [ 779.419131] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 779.419131] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] result = hub.switch() [ 779.419131] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 779.419131] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] return self.greenlet.switch() [ 779.419131] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 779.419131] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] self.f(*self.args, **self.kw) [ 779.419407] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 779.419407] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] raise exceptions.translate_fault(task_info.error) [ 779.419407] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 779.419407] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Faults: ['InvalidArgument'] [ 779.419407] env[67893]: ERROR nova.compute.manager [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] [ 779.419407] env[67893]: DEBUG nova.compute.utils [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 779.420696] env[67893]: DEBUG nova.compute.manager [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Build of instance d9e47a83-7921-4cf6-ba99-fb705bc52e4a was re-scheduled: A specified parameter was not correct: fileType [ 779.420696] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 779.421076] env[67893]: DEBUG nova.compute.manager [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 779.421248] env[67893]: DEBUG nova.compute.manager [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 779.421399] env[67893]: DEBUG nova.compute.manager [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 779.421562] env[67893]: DEBUG nova.network.neutron [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 779.725851] env[67893]: DEBUG nova.network.neutron [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 779.737143] env[67893]: INFO nova.compute.manager [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: d9e47a83-7921-4cf6-ba99-fb705bc52e4a] Took 0.32 seconds to deallocate network for instance. [ 779.844236] env[67893]: INFO nova.scheduler.client.report [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Deleted allocations for instance d9e47a83-7921-4cf6-ba99-fb705bc52e4a [ 779.868340] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a3c6e63a-9595-4093-b846-2371ca9fd43c tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Lock "d9e47a83-7921-4cf6-ba99-fb705bc52e4a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 153.697s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 779.882102] env[67893]: DEBUG nova.compute.manager [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Starting instance... 
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 779.931418] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 779.931804] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 779.933243] env[67893]: INFO nova.compute.claims [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 780.314898] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0cf389de-4ff7-427d-bac9-935068748123 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 780.322636] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da3a3d37-110e-4b85-9b0d-69c0b3030e56 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 780.352679] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e416dced-f685-4b87-8325-59655ce5140e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 780.360189] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ad9198e-a340-4b28-a22a-78ed8cb266c0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 780.373652] env[67893]: DEBUG nova.compute.provider_tree [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 780.382496] env[67893]: DEBUG nova.scheduler.client.report [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 780.396356] env[67893]: DEBUG oslo_concurrency.lockutils [None 
req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.465s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 780.396831] env[67893]: DEBUG nova.compute.manager [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 780.429074] env[67893]: DEBUG nova.compute.utils [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 780.430445] env[67893]: DEBUG nova.compute.manager [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Not allocating networking since 'none' was specified. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 780.442978] env[67893]: DEBUG nova.compute.manager [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 780.515023] env[67893]: DEBUG nova.compute.manager [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 780.544844] env[67893]: DEBUG nova.virt.hardware [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 780.545141] env[67893]: DEBUG nova.virt.hardware [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 780.545304] env[67893]: DEBUG nova.virt.hardware [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 780.548020] env[67893]: DEBUG nova.virt.hardware [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 780.548020] env[67893]: DEBUG nova.virt.hardware [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 780.548020] env[67893]: DEBUG nova.virt.hardware [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 780.548020] env[67893]: DEBUG nova.virt.hardware [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 780.548020] env[67893]: DEBUG nova.virt.hardware [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 780.548215] env[67893]: DEBUG nova.virt.hardware [None 
req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 780.548215] env[67893]: DEBUG nova.virt.hardware [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 780.548215] env[67893]: DEBUG nova.virt.hardware [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 780.548215] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12ff3e40-77d8-40e1-b5fd-d90aaa89f968 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 780.556873] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df48c3dc-64e0-42b7-8391-b1f7857befe4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 780.571291] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Instance VIF info [] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 780.578362] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Creating folder: Project (5f7694b008784ff2b21c60e662e216f0). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 780.578690] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f3144162-caac-421b-8b1e-3a9c4dd1afe4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 780.588473] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Created folder: Project (5f7694b008784ff2b21c60e662e216f0) in parent group-v689771. [ 780.588677] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Creating folder: Instances. Parent ref: group-v689813. 
{{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 780.588904] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-376a0303-6884-4c2e-a6f8-5bc7b04f5c60 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 780.597362] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Created folder: Instances in parent group-v689813. [ 780.597584] env[67893]: DEBUG oslo.service.loopingcall [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 780.597770] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 780.597957] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-2f789b45-e5c3-4ec5-abd9-6b13e4fb1933 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 780.615030] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 780.615030] env[67893]: value = "task-3455349" [ 780.615030] env[67893]: _type = "Task" [ 780.615030] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 780.621423] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455349, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 781.124727] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455349, 'name': CreateVM_Task, 'duration_secs': 0.241527} completed successfully. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 781.124984] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 781.125332] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 781.125494] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 781.125814] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 781.126096] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4e95ecae-f66e-447e-85a9-c304d45a66c1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 781.130309] env[67893]: DEBUG oslo_vmware.api [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Waiting for the task: (returnval){ [ 781.130309] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]5295da04-8bc0-63db-8cb4-af11b8cf20eb" [ 781.130309] env[67893]: _type = "Task" [ 781.130309] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 781.137406] env[67893]: DEBUG oslo_vmware.api [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]5295da04-8bc0-63db-8cb4-af11b8cf20eb, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 781.420553] env[67893]: DEBUG oslo_concurrency.lockutils [None req-9782ff27-31db-4875-8f2e-a64ff6162396 tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Acquiring lock "8831483f-3fbb-4463-9f8f-868d46bb3e4e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 781.420553] env[67893]: DEBUG oslo_concurrency.lockutils [None req-9782ff27-31db-4875-8f2e-a64ff6162396 tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Lock "8831483f-3fbb-4463-9f8f-868d46bb3e4e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 781.640298] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 781.640618] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 781.640897] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 818.661936] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 818.854633] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 818.854633] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 818.876599] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 819.859108] env[67893]: DEBUG
oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 819.859108] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 819.859108] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 819.881248] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 819.881391] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 819.881524] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 819.881650] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 819.881776] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 819.881898] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 819.882029] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 819.882154] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 819.882275] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Skipping network cache update for instance because it is Building. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 819.882392] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 819.882511] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 819.882997] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 819.883181] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 819.883338] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 819.896126] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 819.896342] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 819.896503] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 819.896680] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 819.898237] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36bf9dc2-cf1e-4ea6-8500-18a65fcc75ed {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 819.906762] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2e25a1e-b9b7-446f-a09a-bb751da8f470 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 819.922327] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-4fdcf07e-8f68-4aea-8229-21c59f67d656 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 819.929112] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52f11dac-3cc2-45ee-8b97-f507e5d6f9b6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 819.958024] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180991MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 819.958024] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 819.958024] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 820.031402] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 820.031570] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 820.031700] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 043c631c-bf15-4b4c-9a92-49ea51b6d405 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 820.031824] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 820.031943] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2f69fae8-d060-4156-8880-071f5ee1f969 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 820.032076] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2256af1c-4ff8-46b9-b568-c25ce8886e5f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 820.032197] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 6520080a-8bf1-4803-9099-87c3ba6e28e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 820.032313] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 19ab9782-9131-46ba-bbf2-cc021953046e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 820.032429] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2eb8d698-9436-4e91-bd10-5f5200415144 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 820.032544] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 820.046501] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.056632] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a2a5e9bc-da8b-42df-9f5f-caf70d72cc0b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.066690] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2b03a2c3-33de-4fb4-b723-029652a7c780 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.076725] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 63254807-dead-415e-bdf6-e85780248d8f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.091287] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 194fe3b9-366e-4489-9b1f-2adf2a8ac6ee has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.110531] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance bac538d8-3dda-4851-8aa3-d60bae70b6ff has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.123140] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance d3019243-6b64-4d8f-87bb-ace791093969 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.135960] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance e7b5f3f7-5b2d-49c7-be13-3b481f1b3ca8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.154831] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 9be15a5a-2a28-412b-a893-387b8dd9a2c4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.166069] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 3d091507-3ab2-45da-a366-ff5d3f107134 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.175013] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 938d37dd-509b-4923-b192-3ce4a6d530c5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.188065] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 40c8659f-361a-4bf7-b16c-00bfc2c98729 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.199116] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a9656a7e-8a7b-489e-9990-097c1e93e535 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.208744] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance cb485828-0620-48fd-a9d4-a83e690f4675 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.219718] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance dd3d49f4-83c5-4a83-9674-fed5e190743c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.229497] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 9e3bbca4-2031-4a02-819c-2c9cf720eba9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.240490] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 88053de1-3cc2-4776-a56e-b34aa0c93764 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.252025] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance cb76498a-b404-40f3-ac3f-93aea525abee has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.262040] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance fcae7119-6233-4a52-9e52-1147f2b10ddc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.270764] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 8831483f-3fbb-4463-9f8f-868d46bb3e4e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.271097] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 820.271284] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 820.614905] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb2db58e-f468-4680-8221-e9fd929cbcfb {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 820.622468] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bea9792-047a-485d-a262-e305cafc7f9a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 820.651975] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe21a0f0-0dbf-4690-9873-efe63f7e890b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 820.659273] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb72b912-f328-4309-9dd7-65cfacc7701b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 820.672469] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 
17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 820.683458] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 820.696635] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 820.696887] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.739s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 821.673354] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 821.673667] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 821.673743] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 826.563183] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1e34feff-330a-45d8-a8a6-b7e5bb0dfc4d tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Acquiring lock "3e67c74f-5c03-4dc4-a23b-b547bfb32b4a" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 827.751780] env[67893]: WARNING oslo_vmware.rw_handles [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 827.751780] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 827.751780] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 827.751780] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 827.751780] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 827.751780] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 827.751780] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 827.751780] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 827.751780] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 827.751780] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 827.751780] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 827.751780] env[67893]: ERROR oslo_vmware.rw_handles [ 827.752349] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/972acfb5-544e-4dd9-ae79-cb9b0720bb56/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 827.753866] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 827.754121] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Copying Virtual Disk [datastore1] vmware_temp/972acfb5-544e-4dd9-ae79-cb9b0720bb56/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/972acfb5-544e-4dd9-ae79-cb9b0720bb56/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 827.754401] env[67893]: DEBUG oslo_vmware.service [-] Invoking 
VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5468a92a-9026-4fe3-b466-a5c8e05c3c32 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 827.763314] env[67893]: DEBUG oslo_vmware.api [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Waiting for the task: (returnval){ [ 827.763314] env[67893]: value = "task-3455350" [ 827.763314] env[67893]: _type = "Task" [ 827.763314] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 827.771349] env[67893]: DEBUG oslo_vmware.api [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Task: {'id': task-3455350, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 828.273153] env[67893]: DEBUG oslo_vmware.exceptions [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 828.273526] env[67893]: DEBUG oslo_concurrency.lockutils [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 828.274115] env[67893]: ERROR nova.compute.manager [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 828.274115] env[67893]: Faults: ['InvalidArgument'] [ 828.274115] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Traceback (most recent call last): [ 828.274115] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 828.274115] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] yield resources [ 828.274115] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 828.274115] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] self.driver.spawn(context, instance, image_meta, [ 828.274115] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 828.274115] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 828.274115] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 828.274115] env[67893]: ERROR 
nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] self._fetch_image_if_missing(context, vi) [ 828.274115] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 828.274464] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] image_cache(vi, tmp_image_ds_loc) [ 828.274464] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 828.274464] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] vm_util.copy_virtual_disk( [ 828.274464] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 828.274464] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] session._wait_for_task(vmdk_copy_task) [ 828.274464] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 828.274464] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] return self.wait_for_task(task_ref) [ 828.274464] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 828.274464] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] return evt.wait() [ 828.274464] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 828.274464] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] result = hub.switch() [ 828.274464] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 828.274464] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] return self.greenlet.switch() [ 828.274825] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 828.274825] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] self.f(*self.args, **self.kw) [ 828.274825] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 828.274825] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] raise exceptions.translate_fault(task_info.error) [ 828.274825] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 828.274825] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Faults: ['InvalidArgument'] [ 828.274825] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] [ 828.274825] env[67893]: INFO 
nova.compute.manager [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Terminating instance [ 828.276304] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 828.276572] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 828.277169] env[67893]: DEBUG oslo_concurrency.lockutils [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Acquiring lock "refresh_cache-61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 828.277365] env[67893]: DEBUG oslo_concurrency.lockutils [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Acquired lock "refresh_cache-61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 828.277559] env[67893]: DEBUG nova.network.neutron [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 828.278504] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-979f35ec-63c2-4fe8-b9d7-2775db6e05dd {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.286986] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 828.286986] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 828.288249] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-eedcf62c-897d-40e5-9ce3-e733d82d2dc9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.296101] env[67893]: DEBUG oslo_vmware.api [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Waiting for the task: (returnval){ [ 828.296101] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52ec8319-fe5a-b8fd-6ae6-b97a9c71397e" [ 828.296101] env[67893]: _type = "Task" [ 828.296101] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 828.307248] env[67893]: DEBUG oslo_vmware.api [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52ec8319-fe5a-b8fd-6ae6-b97a9c71397e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 828.386589] env[67893]: DEBUG nova.network.neutron [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 828.471056] env[67893]: DEBUG nova.network.neutron [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 828.481441] env[67893]: DEBUG oslo_concurrency.lockutils [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Releasing lock "refresh_cache-61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 828.481841] env[67893]: DEBUG nova.compute.manager [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Start destroying the instance on the hypervisor. 
{{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 828.482050] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 828.483107] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-713a24c2-733c-4eb8-9188-767aff827cc6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.491441] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 828.491669] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-92afe9a3-19e7-47f6-9e6e-d95016dfb672 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.524965] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 828.525187] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 828.525395] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Deleting the datastore file [datastore1] 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3 {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 828.525613] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a46d8a73-2b04-4f46-9f52-5e24932de468 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.531383] env[67893]: DEBUG oslo_vmware.api [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Waiting for the task: (returnval){ [ 828.531383] env[67893]: value = "task-3455352" [ 828.531383] env[67893]: _type = "Task" [ 828.531383] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 828.538529] env[67893]: DEBUG oslo_vmware.api [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Task: {'id': task-3455352, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 828.805340] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 828.805604] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Creating directory with path [datastore1] vmware_temp/9b84f6e8-f1c3-4378-9c43-6c4c14989eed/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 828.805823] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cf53c2fd-bc42-44cf-92d0-d70484f3b0df {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.817870] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Created directory with path [datastore1] vmware_temp/9b84f6e8-f1c3-4378-9c43-6c4c14989eed/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 828.818113] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Fetch image to [datastore1] vmware_temp/9b84f6e8-f1c3-4378-9c43-6c4c14989eed/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 828.818296] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/9b84f6e8-f1c3-4378-9c43-6c4c14989eed/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 828.819082] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27fd9b94-f44f-427f-8177-063381194d7b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.825656] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99bdfb75-1c51-4740-b960-17c149706d5a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.836369] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8174eff4-1532-4914-9c00-27769d513a5b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.867183] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-3bda2bd1-f8ac-42b0-acdf-ed7e40f009c6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.872881] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6606b3e1-9188-4ff0-9090-e185ffaf2c54 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.892898] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 829.042078] env[67893]: DEBUG oslo_vmware.api [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Task: {'id': task-3455352, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.047501} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 829.043140] env[67893]: DEBUG oslo_vmware.rw_handles [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9b84f6e8-f1c3-4378-9c43-6c4c14989eed/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 829.044663] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 829.044877] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 829.045060] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 829.045236] env[67893]: INFO nova.compute.manager [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Took 0.56 seconds to destroy the instance on the hypervisor. [ 829.045475] env[67893]: DEBUG oslo.service.loopingcall [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 829.098741] env[67893]: DEBUG nova.compute.manager [-] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Skipping network deallocation for instance since networking was not requested. {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 829.101021] env[67893]: DEBUG nova.compute.claims [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 829.101220] env[67893]: DEBUG oslo_concurrency.lockutils [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 829.101431] env[67893]: DEBUG oslo_concurrency.lockutils [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 829.105824] env[67893]: DEBUG oslo_vmware.rw_handles [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 829.105988] env[67893]: DEBUG oslo_vmware.rw_handles [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9b84f6e8-f1c3-4378-9c43-6c4c14989eed/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
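Editor's note: the rw_handles entries above bracket an upload of 21318656 bytes to the ESX host's /folder endpoint. A minimal sketch of that write pattern using only the standard library; the function name, the streaming in one send() call, and the vmware_cgi_ticket cookie are illustrative assumptions, not the exact oslo.vmware implementation:

    # Sketch: PUT image bytes to https://<host>/folder/<path>?dcPath=..&dsName=..
    import http.client
    import ssl
    import urllib.parse

    def upload_to_datastore(host, dc_path, ds_name, file_path, data, ticket):
        query = urllib.parse.urlencode({"dcPath": dc_path, "dsName": ds_name})
        url = "/folder/%s?%s" % (urllib.parse.quote(file_path), query)
        # Lab setup: skip certificate verification, as devstack typically does.
        conn = http.client.HTTPSConnection(
            host, 443, context=ssl._create_unverified_context())
        conn.putrequest("PUT", url)
        # The ticket comes from SessionManager.AcquireGenericServiceTicket,
        # which the log shows being invoked just before the download starts.
        conn.putheader("Cookie", "vmware_cgi_ticket=%s" % ticket)
        conn.putheader("Content-Length", str(len(data)))
        conn.putheader("Content-Type", "application/octet-stream")
        conn.endheaders()
        conn.send(data)            # the real handle streams chunks from Glance
        resp = conn.getresponse()  # "Closing write handle" reads this response
        conn.close()
        return resp.status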
[ 829.518242] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9a44415-6439-4066-9c91-773e1148bfd8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 829.526032] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc4a9c87-3f1b-4c1e-8910-f6e65b795fb7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 829.556077] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c78cde81-3110-414c-b123-81c531742065 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 829.563743] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-996fa232-087c-42ec-8870-e15b04ae091c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 829.577587] env[67893]: DEBUG nova.compute.provider_tree [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 829.586236] env[67893]: DEBUG nova.scheduler.client.report [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 829.599501] env[67893]: DEBUG oslo_concurrency.lockutils [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.498s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 829.600027] env[67893]: ERROR nova.compute.manager [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 829.600027] env[67893]: Faults: ['InvalidArgument']
[ 829.600027] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Traceback (most recent call last):
[ 829.600027] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 829.600027] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3]     self.driver.spawn(context, instance, image_meta,
[ 829.600027] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 829.600027] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 829.600027] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 829.600027] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3]     self._fetch_image_if_missing(context, vi)
[ 829.600027] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 829.600027] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3]     image_cache(vi, tmp_image_ds_loc)
[ 829.600027] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 829.600423] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3]     vm_util.copy_virtual_disk(
[ 829.600423] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 829.600423] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3]     session._wait_for_task(vmdk_copy_task)
[ 829.600423] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 829.600423] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3]     return self.wait_for_task(task_ref)
[ 829.600423] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 829.600423] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3]     return evt.wait()
[ 829.600423] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 829.600423] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3]     result = hub.switch()
[ 829.600423] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 829.600423] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3]     return self.greenlet.switch()
[ 829.600423] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 829.600423] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3]     self.f(*self.args, **self.kw)
[ 829.600423] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 829.600764] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3]     raise exceptions.translate_fault(task_info.error)
[ 829.600764] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 829.600764] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Faults: ['InvalidArgument']
[ 829.600764] env[67893]: ERROR nova.compute.manager [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3]
[ 829.600764] env[67893]: DEBUG nova.compute.utils [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 829.601989] env[67893]: DEBUG nova.compute.manager [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Build of instance 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3 was re-scheduled: A specified parameter was not correct: fileType
[ 829.601989] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 829.602374] env[67893]: DEBUG nova.compute.manager [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 829.602595] env[67893]: DEBUG oslo_concurrency.lockutils [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Acquiring lock "refresh_cache-61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 829.602741] env[67893]: DEBUG oslo_concurrency.lockutils [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Acquired lock "refresh_cache-61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 829.602899] env[67893]: DEBUG nova.network.neutron [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 829.627905] env[67893]: DEBUG nova.network.neutron [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
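Editor's note: the traceback above shows how a VIM fault surfaces to Nova: _poll_task raises VimFaultException, whose fault_list attribute is the source of the "Faults: ['InvalidArgument']" lines. A schematic of how calling code can inspect that exception; the session and task objects, and the tolerated fault name, are placeholders:

    from oslo_vmware import exceptions as vexc

    def wait_checked(session, task_ref):
        try:
            return session.wait_for_task(task_ref)
        except vexc.VimFaultException as e:
            # e.fault_list holds the VIM fault names, e.g. ['InvalidArgument'].
            if 'FileAlreadyExists' in e.fault_list:
                return None  # a caller might tolerate idempotent-create faults
            raise  # InvalidArgument etc. propagate; Nova then re-schedules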
[ 829.701701] env[67893]: DEBUG nova.network.neutron [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 829.711315] env[67893]: DEBUG oslo_concurrency.lockutils [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Releasing lock "refresh_cache-61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 829.711474] env[67893]: DEBUG nova.compute.manager [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 829.711663] env[67893]: DEBUG nova.compute.manager [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] [instance: 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3] Skipping network deallocation for instance since networking was not requested. {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 829.805805] env[67893]: INFO nova.scheduler.client.report [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Deleted allocations for instance 61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3
[ 829.820904] env[67893]: DEBUG oslo_concurrency.lockutils [None req-590c3e11-6a12-4b94-b468-70e1b338205e tempest-ServersAdmin275Test-547017154 tempest-ServersAdmin275Test-547017154-project-member] Lock "61f9c4d7-0e4f-4d1d-af9d-ceb0aac42cd3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 192.413s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 829.839730] env[67893]: DEBUG nova.compute.manager [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 829.891100] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 829.891336] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 829.892923] env[67893]: INFO nova.compute.claims [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 830.321614] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ef69c38-ca11-4243-a01d-f0f28b1f0457 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 830.329817] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f9a80d4-bc26-45da-a596-c984a4a5a002 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 830.360542] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6db0740-9908-4dd1-b416-5d10f49c360b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 830.368341] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4b25d3a-8d01-4d68-b610-8f5e9567ae31 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 830.382044] env[67893]: DEBUG nova.compute.provider_tree [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 830.392829] env[67893]: DEBUG nova.scheduler.client.report [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
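Editor's note: the inventory dict reported to placement above determines schedulable capacity as (total - reserved) * allocation_ratio per resource class. Reproducing that arithmetic for the provider shown, with the values taken directly from the log:

    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        usable = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, usable)
    # VCPU 192.0         (48 physical vCPUs oversubscribed 4x)
    # MEMORY_MB 196078.0 (512 MB held back for the host)
    # DISK_GB 400.0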
[ 830.406052] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.515s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 830.406584] env[67893]: DEBUG nova.compute.manager [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 830.446489] env[67893]: DEBUG nova.compute.utils [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 830.447741] env[67893]: DEBUG nova.compute.manager [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 830.447907] env[67893]: DEBUG nova.network.neutron [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 830.457662] env[67893]: DEBUG nova.compute.manager [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 830.507879] env[67893]: DEBUG nova.policy [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'eb342e4315d34663a14ba8b6064562d1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '222114505dcd49c78b1e234a0edfd6bf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}}
[ 830.524582] env[67893]: DEBUG nova.compute.manager [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Start spawning the instance on the hypervisor. {{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 830.551463] env[67893]: DEBUG nova.virt.hardware [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 830.551718] env[67893]: DEBUG nova.virt.hardware [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 830.551879] env[67893]: DEBUG nova.virt.hardware [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 830.552069] env[67893]: DEBUG nova.virt.hardware [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 830.552221] env[67893]: DEBUG nova.virt.hardware [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 830.552361] env[67893]: DEBUG nova.virt.hardware [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 830.552571] env[67893]: DEBUG nova.virt.hardware [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 830.552731] env[67893]: DEBUG nova.virt.hardware [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
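Editor's note: the hardware.py entries above enumerate sockets*cores*threads factorizations of the vCPU count within the flavor/image limits. A simplified illustration of that step (not Nova's actual implementation); for vcpus=1 under the 65536 limits logged, the only result is (1, 1, 1), matching "Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)]" below:

    def possible_topologies(vcpus, max_sockets, max_cores, max_threads):
        # Yield every (sockets, cores, threads) whose product is the vCPU count.
        for sockets in range(1, min(vcpus, max_sockets) + 1):
            for cores in range(1, min(vcpus, max_cores) + 1):
                for threads in range(1, min(vcpus, max_threads) + 1):
                    if sockets * cores * threads == vcpus:
                        yield (sockets, cores, threads)

    print(list(possible_topologies(1, 65536, 65536, 65536)))  # [(1, 1, 1)]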
[ 830.552899] env[67893]: DEBUG nova.virt.hardware [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 830.553073] env[67893]: DEBUG nova.virt.hardware [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 830.553247] env[67893]: DEBUG nova.virt.hardware [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 830.554114] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a156c38b-790c-4a4b-9cca-97f1ac3e0fc8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 830.562193] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f61097e0-67be-4eb7-b872-e4b89410fe41 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 830.970794] env[67893]: DEBUG nova.network.neutron [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Successfully created port: 24494b22-7fb2-4896-b53e-c438f5928fbe {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 831.820244] env[67893]: DEBUG nova.network.neutron [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Successfully updated port: 24494b22-7fb2-4896-b53e-c438f5928fbe {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 831.827895] env[67893]: DEBUG nova.compute.manager [req-f95c52a0-e9fe-475f-9702-0e9a438facab req-84de95cc-583e-4ec7-8725-53cdf7ce1736 service nova] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Received event network-vif-plugged-24494b22-7fb2-4896-b53e-c438f5928fbe {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 831.828206] env[67893]: DEBUG oslo_concurrency.lockutils [req-f95c52a0-e9fe-475f-9702-0e9a438facab req-84de95cc-583e-4ec7-8725-53cdf7ce1736 service nova] Acquiring lock "c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 831.830532] env[67893]: DEBUG oslo_concurrency.lockutils [req-f95c52a0-e9fe-475f-9702-0e9a438facab req-84de95cc-583e-4ec7-8725-53cdf7ce1736 service nova] Lock "c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 831.830532] env[67893]: DEBUG oslo_concurrency.lockutils [req-f95c52a0-e9fe-475f-9702-0e9a438facab req-84de95cc-583e-4ec7-8725-53cdf7ce1736 service nova] Lock "c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 831.830532] env[67893]: DEBUG nova.compute.manager [req-f95c52a0-e9fe-475f-9702-0e9a438facab req-84de95cc-583e-4ec7-8725-53cdf7ce1736 service nova] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] No waiting events found dispatching network-vif-plugged-24494b22-7fb2-4896-b53e-c438f5928fbe {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 831.830532] env[67893]: WARNING nova.compute.manager [req-f95c52a0-e9fe-475f-9702-0e9a438facab req-84de95cc-583e-4ec7-8725-53cdf7ce1736 service nova] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Received unexpected event network-vif-plugged-24494b22-7fb2-4896-b53e-c438f5928fbe for instance with vm_state building and task_state spawning.
[ 831.839831] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Acquiring lock "refresh_cache-c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 831.840630] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Acquired lock "refresh_cache-c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 831.842660] env[67893]: DEBUG nova.network.neutron [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 831.887592] env[67893]: DEBUG nova.network.neutron [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 832.142603] env[67893]: DEBUG nova.network.neutron [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Updating instance_info_cache with network_info: [{"id": "24494b22-7fb2-4896-b53e-c438f5928fbe", "address": "fa:16:3e:a7:e7:25", "network": {"id": "f8432b0e-40e4-4385-8394-3dad48702bf3", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-768315130-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "222114505dcd49c78b1e234a0edfd6bf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a15de394-0367-4921-a5c1-6ac8615e3283", "external-id": "nsx-vlan-transportzone-13", "segmentation_id": 13, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap24494b22-7f", "ovs_interfaceid": "24494b22-7fb2-4896-b53e-c438f5928fbe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 832.157638] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Releasing lock "refresh_cache-c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 832.157922] env[67893]: DEBUG nova.compute.manager [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Instance network_info: |[{"id": "24494b22-7fb2-4896-b53e-c438f5928fbe", "address": "fa:16:3e:a7:e7:25", "network": {"id": "f8432b0e-40e4-4385-8394-3dad48702bf3", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-768315130-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "222114505dcd49c78b1e234a0edfd6bf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a15de394-0367-4921-a5c1-6ac8615e3283", "external-id": "nsx-vlan-transportzone-13", "segmentation_id": 13, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap24494b22-7f", "ovs_interfaceid": "24494b22-7fb2-4896-b53e-c438f5928fbe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
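Editor's note: the network_info blob above is a list of VIF dicts. A small helper (illustrative, not a Nova API) showing how the fields Nova uses next, the MAC, fixed IPs, and the NSX logical-switch id, sit inside that structure:

    def summarize_vif(vif):
        ips = [ip['address']
               for subnet in vif['network']['subnets']
               for ip in subnet['ips']]
        return {
            'port_id': vif['id'],    # 24494b22-7fb2-4896-b53e-c438f5928fbe
            'mac': vif['address'],   # fa:16:3e:a7:e7:25
            'ips': ips,              # ['192.168.128.11']
            'switch': vif['details'].get('nsx-logical-switch-id'),
        }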
[ 832.158318] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a7:e7:25', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a15de394-0367-4921-a5c1-6ac8615e3283', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '24494b22-7fb2-4896-b53e-c438f5928fbe', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 832.165772] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Creating folder: Project (222114505dcd49c78b1e234a0edfd6bf). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 832.166676] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-07620930-2b62-47f0-bbbb-a6afbdbb26d4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 832.177384] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Created folder: Project (222114505dcd49c78b1e234a0edfd6bf) in parent group-v689771.
[ 832.177577] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Creating folder: Instances. Parent ref: group-v689816. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 832.177806] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6c3ab0e4-b654-40a3-a56e-97dd74b792fc {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 832.187981] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Created folder: Instances in parent group-v689816.
[ 832.188226] env[67893]: DEBUG oslo.service.loopingcall [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 832.188406] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 832.188620] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-662cf900-c414-44c3-a586-b00bac8c049e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 832.208072] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 832.208072] env[67893]: value = "task-3455355"
[ 832.208072] env[67893]: _type = "Task"
[ 832.208072] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 832.215725] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455355, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 832.718628] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455355, 'name': CreateVM_Task, 'duration_secs': 0.334985} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 832.718788] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
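Editor's note: the "Waiting for the task" / "progress is 0%" / "completed successfully" lines above come from a polling loop that re-reads the vCenter task's state until it leaves the queued/running states. A schematic of that loop; get_task_info and the state strings mirror the vSphere task model but are stand-ins here, not oslo.vmware's actual code:

    import time

    def poll_task(get_task_info, task_ref, interval=0.5):
        while True:
            info = get_task_info(task_ref)      # e.g. task-3455355, CreateVM_Task
            if info.state == 'success':
                return info.result
            if info.state == 'error':
                # oslo.vmware translates this into VimFaultException
                raise RuntimeError(info.error)
            time.sleep(interval)                # 'queued'/'running': keep waiting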
[ 832.719486] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 832.719721] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 832.719969] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 832.720219] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-21714fe0-e9fb-4f83-ab31-a03e09be903e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 832.724498] env[67893]: DEBUG oslo_vmware.api [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Waiting for the task: (returnval){
[ 832.724498] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52070e62-1c7e-c35d-ade2-4ce4777e7bf4"
[ 832.724498] env[67893]: _type = "Task"
[ 832.724498] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 832.732206] env[67893]: DEBUG oslo_vmware.api [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52070e62-1c7e-c35d-ade2-4ce4777e7bf4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 833.235478] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 833.235788] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 833.236203] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 833.949131] env[67893]: DEBUG nova.compute.manager [req-2dd33080-a43e-46b2-b63b-947c51336e17 req-ae4fdffd-3a34-4cf0-91d8-515e2cb3926b service nova] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Received event network-changed-24494b22-7fb2-4896-b53e-c438f5928fbe {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 833.949497] env[67893]: DEBUG nova.compute.manager [req-2dd33080-a43e-46b2-b63b-947c51336e17 req-ae4fdffd-3a34-4cf0-91d8-515e2cb3926b service nova] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Refreshing instance network info cache due to event network-changed-24494b22-7fb2-4896-b53e-c438f5928fbe. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 833.949632] env[67893]: DEBUG oslo_concurrency.lockutils [req-2dd33080-a43e-46b2-b63b-947c51336e17 req-ae4fdffd-3a34-4cf0-91d8-515e2cb3926b service nova] Acquiring lock "refresh_cache-c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 833.949822] env[67893]: DEBUG oslo_concurrency.lockutils [req-2dd33080-a43e-46b2-b63b-947c51336e17 req-ae4fdffd-3a34-4cf0-91d8-515e2cb3926b service nova] Acquired lock "refresh_cache-c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 833.949991] env[67893]: DEBUG nova.network.neutron [req-2dd33080-a43e-46b2-b63b-947c51336e17 req-ae4fdffd-3a34-4cf0-91d8-515e2cb3926b service nova] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Refreshing network info cache for port 24494b22-7fb2-4896-b53e-c438f5928fbe {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 834.493941] env[67893]: DEBUG nova.network.neutron [req-2dd33080-a43e-46b2-b63b-947c51336e17 req-ae4fdffd-3a34-4cf0-91d8-515e2cb3926b service nova] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Updated VIF entry in instance network info cache for port 24494b22-7fb2-4896-b53e-c438f5928fbe. {{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 834.494432] env[67893]: DEBUG nova.network.neutron [req-2dd33080-a43e-46b2-b63b-947c51336e17 req-ae4fdffd-3a34-4cf0-91d8-515e2cb3926b service nova] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Updating instance_info_cache with network_info: [{"id": "24494b22-7fb2-4896-b53e-c438f5928fbe", "address": "fa:16:3e:a7:e7:25", "network": {"id": "f8432b0e-40e4-4385-8394-3dad48702bf3", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-768315130-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "222114505dcd49c78b1e234a0edfd6bf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a15de394-0367-4921-a5c1-6ac8615e3283", "external-id": "nsx-vlan-transportzone-13", "segmentation_id": 13, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap24494b22-7f", "ovs_interfaceid": "24494b22-7fb2-4896-b53e-c438f5928fbe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 834.505154] env[67893]: DEBUG oslo_concurrency.lockutils [req-2dd33080-a43e-46b2-b63b-947c51336e17 req-ae4fdffd-3a34-4cf0-91d8-515e2cb3926b service nova] Releasing lock "refresh_cache-c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 836.809123] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b6a4dab1-4cfc-40f1-876a-0d8b309b0cb1 tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Acquiring lock "043c631c-bf15-4b4c-9a92-49ea51b6d405" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
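Editor's note: every "Acquiring"/"Acquired"/"Releasing lock" line in this log is emitted by oslo.concurrency's lockutils. A minimal sketch of the two usage patterns involved; the lock names mirror those in the log, and the function bodies are placeholders:

    from oslo_concurrency import lockutils

    def refresh_cache(instance_uuid):
        # Context-manager form, as used for the refresh_cache-<uuid> locks.
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            pass  # rebuild the instance's network info cache here

    @lockutils.synchronized('compute_resources')
    def instance_claim():
        # Decorator form, as used by the resource tracker's claim paths.
        pass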
[ 841.866726] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Acquiring lock "2553f3c0-0988-4e11-a138-7e5f71e71f48" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 841.867104] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Lock "2553f3c0-0988-4e11-a138-7e5f71e71f48" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 843.415164] env[67893]: DEBUG oslo_concurrency.lockutils [None req-26e9203a-f9a7-4684-a039-c28d70896566 tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Acquiring lock "96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 845.102927] env[67893]: DEBUG oslo_concurrency.lockutils [None req-99c49e91-ec40-4634-8ecd-d9fc6369c4cd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Acquiring lock "2f69fae8-d060-4156-8880-071f5ee1f969" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 849.310272] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d1fed5f6-c1f8-4bb0-9825-ba0b473b5a13 tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Acquiring lock "2256af1c-4ff8-46b9-b568-c25ce8886e5f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 850.655581] env[67893]: DEBUG oslo_concurrency.lockutils [None req-131384d3-fdb9-401c-ab57-79d17eaa2a94 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Acquiring lock "6520080a-8bf1-4803-9099-87c3ba6e28e4" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 855.727077] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c8cc79dd-eeb7-4ac8-98fd-5a13cab7f00d tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Acquiring lock "c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 855.732545] env[67893]: DEBUG oslo_concurrency.lockutils [None req-186ba630-f253-4bd4-9a5b-92c8fe71a02f tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Acquiring lock "2eb8d698-9436-4e91-bd10-5f5200415144" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 855.783026] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3078c700-ca6c-465a-82fb-b854d5091124 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Acquiring lock "5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 861.365755] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Acquiring lock "c05df6c1-e4c9-4276-9981-e80e584d540c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 861.366080] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Lock "c05df6c1-e4c9-4276-9981-e80e584d540c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 870.632622] env[67893]: DEBUG oslo_concurrency.lockutils [None req-905c3871-724a-4a60-b48e-9dd9d69b26de tempest-ServerShowV247Test-307814888 tempest-ServerShowV247Test-307814888-project-member] Acquiring lock "89e2963e-83e2-4e29-843d-7c15abdf78bc" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 870.633548] env[67893]: DEBUG oslo_concurrency.lockutils [None req-905c3871-724a-4a60-b48e-9dd9d69b26de tempest-ServerShowV247Test-307814888 tempest-ServerShowV247Test-307814888-project-member] Lock "89e2963e-83e2-4e29-843d-7c15abdf78bc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 872.566306] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7beee04a-5161-4269-b89d-15109176b7f5 tempest-ServerShowV247Test-307814888 tempest-ServerShowV247Test-307814888-project-member] Acquiring lock "88a48088-829d-40c1-85e1-6e78b8f5cea9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 872.566617] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7beee04a-5161-4269-b89d-15109176b7f5 tempest-ServerShowV247Test-307814888 tempest-ServerShowV247Test-307814888-project-member] Lock "88a48088-829d-40c1-85e1-6e78b8f5cea9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 876.859386] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 876.861670] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Cleaning up deleted instances {{(pid=67893) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}}
[ 876.874047] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] There are 0 instances to clean {{(pid=67893) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}}
[ 876.875375] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 876.875375] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Cleaning up deleted instances with incomplete migration {{(pid=67893) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}}
[ 876.891121] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
"/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 877.772112] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 877.772112] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 877.772112] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 877.772112] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 877.772112] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 877.772112] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 877.772112] env[67893]: ERROR oslo_vmware.rw_handles [ 877.774775] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/9b84f6e8-f1c3-4378-9c43-6c4c14989eed/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 877.779380] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 877.779380] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Copying Virtual Disk [datastore1] vmware_temp/9b84f6e8-f1c3-4378-9c43-6c4c14989eed/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/9b84f6e8-f1c3-4378-9c43-6c4c14989eed/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 877.779380] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-634b7f34-2747-49ef-8142-e6ee3835a6fc {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 877.789302] env[67893]: DEBUG oslo_vmware.api [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Waiting for the task: (returnval){ [ 877.789302] env[67893]: value = "task-3455356" [ 877.789302] env[67893]: _type = "Task" [ 877.789302] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 877.800240] env[67893]: DEBUG oslo_vmware.api [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Task: {'id': task-3455356, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 877.901492] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 878.307416] env[67893]: DEBUG oslo_vmware.exceptions [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 878.307775] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 878.308412] env[67893]: ERROR nova.compute.manager [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 878.308412] env[67893]: Faults: ['InvalidArgument'] [ 878.308412] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Traceback (most recent call last): [ 878.308412] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 878.308412] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] yield resources [ 878.308412] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 878.308412] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] self.driver.spawn(context, instance, image_meta, [ 878.308412] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 878.308412] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 878.308412] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 878.308412] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] self._fetch_image_if_missing(context, vi) [ 878.308412] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 878.308798] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] image_cache(vi, tmp_image_ds_loc) [ 878.308798] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in 
_cache_sparse_image [ 878.308798] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] vm_util.copy_virtual_disk( [ 878.308798] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 878.308798] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] session._wait_for_task(vmdk_copy_task) [ 878.308798] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 878.308798] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] return self.wait_for_task(task_ref) [ 878.308798] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 878.308798] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] return evt.wait() [ 878.308798] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 878.308798] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] result = hub.switch() [ 878.308798] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 878.308798] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] return self.greenlet.switch() [ 878.309141] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 878.309141] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] self.f(*self.args, **self.kw) [ 878.309141] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 878.309141] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] raise exceptions.translate_fault(task_info.error) [ 878.309141] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 878.309141] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Faults: ['InvalidArgument'] [ 878.309141] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] [ 878.309141] env[67893]: INFO nova.compute.manager [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Terminating instance [ 878.310914] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 878.311341] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 878.311976] env[67893]: DEBUG nova.compute.manager [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 878.312469] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 878.312741] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-42d1dae8-80b4-4e3b-a9fa-3fb52e6467d5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 878.315404] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0dec5cf2-51ce-4756-8828-a6d88014f9f1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 878.323519] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 878.323829] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c3449e53-473c-462c-a8d2-693a139e5de5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 878.326609] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 878.326823] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 878.328260] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d2021825-91af-4cb9-8624-438c07ca9ee1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 878.334684] env[67893]: DEBUG oslo_vmware.api [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Waiting for the task: (returnval){ [ 878.334684] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52984a21-c9f5-0f55-f72f-e3f98f33658a" [ 878.334684] env[67893]: _type = "Task" [ 878.334684] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 878.349278] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 878.349549] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Creating directory with path [datastore1] vmware_temp/1eb78104-db46-41b4-b4ae-080690901cdf/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 878.349812] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4529445c-7ca5-412e-848b-8f1d8098584f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 878.362949] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Created directory with path [datastore1] vmware_temp/1eb78104-db46-41b4-b4ae-080690901cdf/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 878.363136] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Fetch image to [datastore1] vmware_temp/1eb78104-db46-41b4-b4ae-080690901cdf/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 878.363593] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/1eb78104-db46-41b4-b4ae-080690901cdf/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 878.364104] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx 
with opID=oslo.vmware-f5025183-ccca-450e-9b12-bc3adfb6629f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 878.374991] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b6b4bf3-ffa1-451d-9b61-d70780c9d8fa {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 878.385078] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55ac995a-a505-487f-abe9-48ce786dc7bb {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 878.423714] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1de38e2-aa6c-467c-a800-e4f0361bd616 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 878.427120] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 878.427120] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 878.427233] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Deleting the datastore file [datastore1] 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 878.427467] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-72debb68-62a6-442e-91e9-f29414ab9594 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 878.434349] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4be624aa-fb0b-4e79-bb78-dbe1997c4746 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 878.436426] env[67893]: DEBUG oslo_vmware.api [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Waiting for the task: (returnval){ [ 878.436426] env[67893]: value = "task-3455358" [ 878.436426] env[67893]: _type = "Task" [ 878.436426] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 878.445123] env[67893]: DEBUG oslo_vmware.api [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Task: {'id': task-3455358, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 878.523840] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 878.602114] env[67893]: DEBUG oslo_vmware.rw_handles [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1eb78104-db46-41b4-b4ae-080690901cdf/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 878.673038] env[67893]: DEBUG oslo_vmware.rw_handles [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 878.673038] env[67893]: DEBUG oslo_vmware.rw_handles [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1eb78104-db46-41b4-b4ae-080690901cdf/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 878.853822] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 878.858725] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 878.947438] env[67893]: DEBUG oslo_vmware.api [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Task: {'id': task-3455358, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072586} completed successfully. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 878.947960] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 878.947960] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 878.948145] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 878.948179] env[67893]: INFO nova.compute.manager [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Took 0.64 seconds to destroy the instance on the hypervisor. [ 878.950386] env[67893]: DEBUG nova.compute.claims [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 878.950561] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 878.950773] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 879.398160] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25539399-fc2d-48a5-b3c6-3f330d6b5c64 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 879.407299] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ea175f2-8ff4-4663-960f-4cf948a6b7fc {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 879.445339] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38054f0b-a465-4a35-87b8-6db875a0f190 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 879.453763] env[67893]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a34c43c-f53c-41f1-aae7-23d65a5c1504 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 879.468339] env[67893]: DEBUG nova.compute.provider_tree [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 879.488158] env[67893]: DEBUG nova.scheduler.client.report [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 879.515846] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.565s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 879.516589] env[67893]: ERROR nova.compute.manager [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 879.516589] env[67893]: Faults: ['InvalidArgument'] [ 879.516589] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Traceback (most recent call last): [ 879.516589] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 879.516589] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] self.driver.spawn(context, instance, image_meta, [ 879.516589] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 879.516589] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 879.516589] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 879.516589] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] self._fetch_image_if_missing(context, vi) [ 879.516589] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 879.516589] 
env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] image_cache(vi, tmp_image_ds_loc) [ 879.516589] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 879.516976] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] vm_util.copy_virtual_disk( [ 879.516976] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 879.516976] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] session._wait_for_task(vmdk_copy_task) [ 879.516976] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 879.516976] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] return self.wait_for_task(task_ref) [ 879.516976] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 879.516976] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] return evt.wait() [ 879.516976] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 879.516976] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] result = hub.switch() [ 879.516976] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 879.516976] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] return self.greenlet.switch() [ 879.516976] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 879.516976] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] self.f(*self.args, **self.kw) [ 879.517382] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 879.517382] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] raise exceptions.translate_fault(task_info.error) [ 879.517382] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 879.517382] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Faults: ['InvalidArgument'] [ 879.517382] env[67893]: ERROR nova.compute.manager [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] [ 879.517382] env[67893]: DEBUG nova.compute.utils [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 
879.521023] env[67893]: DEBUG nova.compute.manager [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Build of instance 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a was re-scheduled: A specified parameter was not correct: fileType [ 879.521023] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 879.521023] env[67893]: DEBUG nova.compute.manager [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 879.521023] env[67893]: DEBUG nova.compute.manager [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 879.521023] env[67893]: DEBUG nova.compute.manager [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 879.521316] env[67893]: DEBUG nova.network.neutron [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 880.143310] env[67893]: DEBUG nova.network.neutron [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 880.165374] env[67893]: INFO nova.compute.manager [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Took 0.65 seconds to deallocate network for instance. 
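The failed-spawn traceback above bottoms out in oslo_vmware.api's wait_for_task(), which polls the vCenter task and re-raises the task's server-side fault as VimFaultException; that is where the repeated "Faults: ['InvalidArgument']" lines come from. Below is a minimal sketch of the same polling pattern using oslo.vmware's public session API rather than Nova's private _call_method/_wait_for_task wrappers; the vCenter host, credentials, datacenter moref, and datastore paths are placeholders, not values taken from this log.

from oslo_vmware import api, exceptions, vim_util

# Placeholder endpoint and credentials (not from the log above).
session = api.VMwareAPISession(
    'vcenter.example.test', 'svc-nova', 's3cret',
    api_retry_count=10, task_poll_interval=0.5)

# CopyVirtualDisk_Task is the call that fails above with
# "A specified parameter was not correct: fileType".
disk_mgr = session.vim.service_content.virtualDiskManager
dc_ref = vim_util.get_moref('datacenter-2', 'Datacenter')  # placeholder moref
task = session.invoke_api(
    session.vim, 'CopyVirtualDisk_Task', disk_mgr,
    sourceName='[datastore1] vmware_temp/example/tmp-sparse.vmdk',
    sourceDatacenter=dc_ref,
    destName='[datastore1] vmware_temp/example/example.vmdk',
    destDatacenter=dc_ref)

try:
    # wait_for_task() polls task.info in a looping call and, on an error
    # state, raises exceptions.translate_fault(task_info.error) -- the
    # raise visible at the bottom of the traceback above.
    session.wait_for_task(task)
except exceptions.VimFaultException as e:
    # fault_list carries the raw VIM fault names, e.g. ['InvalidArgument'].
    print(e.fault_list, str(e))

This also explains the earlier "Fault InvalidArgument not matched" debug line: get_fault_class() finds no specific exception class registered for InvalidArgument, so translate_fault() falls through to the generic VimFaultException seen in the traceback.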
[ 880.318770] env[67893]: INFO nova.scheduler.client.report [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Deleted allocations for instance 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a [ 880.353234] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fb9e0c3f-1ca2-4ad3-8636-63d415c89e1c tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Lock "3e67c74f-5c03-4dc4-a23b-b547bfb32b4a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 251.357s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 880.355823] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1e34feff-330a-45d8-a8a6-b7e5bb0dfc4d tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Lock "3e67c74f-5c03-4dc4-a23b-b547bfb32b4a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 53.793s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 880.356077] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1e34feff-330a-45d8-a8a6-b7e5bb0dfc4d tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Acquiring lock "3e67c74f-5c03-4dc4-a23b-b547bfb32b4a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 880.356278] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1e34feff-330a-45d8-a8a6-b7e5bb0dfc4d tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Lock "3e67c74f-5c03-4dc4-a23b-b547bfb32b4a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 880.356437] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1e34feff-330a-45d8-a8a6-b7e5bb0dfc4d tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Lock "3e67c74f-5c03-4dc4-a23b-b547bfb32b4a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 880.359534] env[67893]: INFO nova.compute.manager [None req-1e34feff-330a-45d8-a8a6-b7e5bb0dfc4d tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Terminating instance [ 880.361099] env[67893]: DEBUG nova.compute.manager [None req-1e34feff-330a-45d8-a8a6-b7e5bb0dfc4d tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Start destroying the instance on the hypervisor. 
{{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 880.361346] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-1e34feff-330a-45d8-a8a6-b7e5bb0dfc4d tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 880.361814] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-70e88396-b545-4009-b229-de3591f3e7f8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 880.372734] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa9d4839-226a-4b10-901f-b5d06689b439 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 880.388097] env[67893]: DEBUG nova.compute.manager [None req-43a8cb85-2105-433c-a1df-9ce2d35f4f66 tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: a2a5e9bc-da8b-42df-9f5f-caf70d72cc0b] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 880.409826] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-1e34feff-330a-45d8-a8a6-b7e5bb0dfc4d tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a could not be found. [ 880.410058] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-1e34feff-330a-45d8-a8a6-b7e5bb0dfc4d tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 880.410241] env[67893]: INFO nova.compute.manager [None req-1e34feff-330a-45d8-a8a6-b7e5bb0dfc4d tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Took 0.05 seconds to destroy the instance on the hypervisor. [ 880.410553] env[67893]: DEBUG oslo.service.loopingcall [None req-1e34feff-330a-45d8-a8a6-b7e5bb0dfc4d tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 880.410827] env[67893]: DEBUG nova.compute.manager [-] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 880.410827] env[67893]: DEBUG nova.network.neutron [-] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 880.425447] env[67893]: DEBUG nova.compute.manager [None req-43a8cb85-2105-433c-a1df-9ce2d35f4f66 tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: a2a5e9bc-da8b-42df-9f5f-caf70d72cc0b] Instance disappeared before build. 
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 880.463459] env[67893]: DEBUG nova.network.neutron [-] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 880.476525] env[67893]: INFO nova.compute.manager [-] [instance: 3e67c74f-5c03-4dc4-a23b-b547bfb32b4a] Took 0.07 seconds to deallocate network for instance. [ 880.550666] env[67893]: DEBUG oslo_concurrency.lockutils [None req-43a8cb85-2105-433c-a1df-9ce2d35f4f66 tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Lock "a2a5e9bc-da8b-42df-9f5f-caf70d72cc0b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 219.014s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 880.572576] env[67893]: DEBUG nova.compute.manager [None req-ee288a15-7b64-45cb-a1c7-d5d253212f89 tempest-ServersAdminTestJSON-469467968 tempest-ServersAdminTestJSON-469467968-project-member] [instance: 2b03a2c3-33de-4fb4-b723-029652a7c780] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 880.608426] env[67893]: DEBUG nova.compute.manager [None req-ee288a15-7b64-45cb-a1c7-d5d253212f89 tempest-ServersAdminTestJSON-469467968 tempest-ServersAdminTestJSON-469467968-project-member] [instance: 2b03a2c3-33de-4fb4-b723-029652a7c780] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 880.639930] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b8c1a89c-287f-4e8d-bca0-ac6bb96df018 tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Acquiring lock "d8f75420-059d-4af1-8545-b5c4f67f4fe3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 880.640217] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b8c1a89c-287f-4e8d-bca0-ac6bb96df018 tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Lock "d8f75420-059d-4af1-8545-b5c4f67f4fe3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 880.661241] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ee288a15-7b64-45cb-a1c7-d5d253212f89 tempest-ServersAdminTestJSON-469467968 tempest-ServersAdminTestJSON-469467968-project-member] Lock "2b03a2c3-33de-4fb4-b723-029652a7c780" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 217.695s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 880.747508] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1e34feff-330a-45d8-a8a6-b7e5bb0dfc4d tempest-ServerExternalEventsTest-232543362 tempest-ServerExternalEventsTest-232543362-project-member] Lock "3e67c74f-5c03-4dc4-a23b-b547bfb32b4a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.392s {{(pid=67893) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 880.748990] env[67893]: DEBUG nova.compute.manager [None req-d9567953-f6cd-40ac-b9fd-8dd06d10f4ba tempest-ServersWithSpecificFlavorTestJSON-1749998521 tempest-ServersWithSpecificFlavorTestJSON-1749998521-project-member] [instance: 63254807-dead-415e-bdf6-e85780248d8f] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 880.827870] env[67893]: DEBUG nova.compute.manager [None req-d9567953-f6cd-40ac-b9fd-8dd06d10f4ba tempest-ServersWithSpecificFlavorTestJSON-1749998521 tempest-ServersWithSpecificFlavorTestJSON-1749998521-project-member] [instance: 63254807-dead-415e-bdf6-e85780248d8f] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 880.858396] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 880.858552] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 880.858943] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 880.860864] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d9567953-f6cd-40ac-b9fd-8dd06d10f4ba tempest-ServersWithSpecificFlavorTestJSON-1749998521 tempest-ServersWithSpecificFlavorTestJSON-1749998521-project-member] Lock "63254807-dead-415e-bdf6-e85780248d8f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 216.957s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 880.883535] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 880.883820] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 880.884279] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 880.884279] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Skipping network cache update for instance because it is Building. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 880.884396] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 880.884535] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 880.884784] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 880.884972] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 880.885161] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 880.885295] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 880.885601] env[67893]: DEBUG nova.compute.manager [None req-14a3ca71-5c94-427c-a9f7-19e2dc8dc2b2 tempest-ServersAdminTestJSON-469467968 tempest-ServersAdminTestJSON-469467968-project-member] [instance: 194fe3b9-366e-4489-9b1f-2adf2a8ac6ee] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 880.888493] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 880.889149] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 880.889362] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 880.889580] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 880.902730] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 880.903259] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 880.903505] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 880.903760] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 880.904888] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f0eb826-8c1b-4c56-b569-3c36fa011dd3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 880.913569] env[67893]: DEBUG nova.compute.manager [None req-14a3ca71-5c94-427c-a9f7-19e2dc8dc2b2 tempest-ServersAdminTestJSON-469467968 tempest-ServersAdminTestJSON-469467968-project-member] [instance: 194fe3b9-366e-4489-9b1f-2adf2a8ac6ee] Instance disappeared before build. 
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 880.916309] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bd7091c-e96b-4675-9b30-3718bba28464 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 880.939856] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bae41c5-e4ab-4f13-8271-8720240c8ad3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 880.949772] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c53273bb-7dd8-4eb9-9581-afb9d60e781b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 880.955638] env[67893]: DEBUG oslo_concurrency.lockutils [None req-14a3ca71-5c94-427c-a9f7-19e2dc8dc2b2 tempest-ServersAdminTestJSON-469467968 tempest-ServersAdminTestJSON-469467968-project-member] Lock "194fe3b9-366e-4489-9b1f-2adf2a8ac6ee" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 215.875s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 880.983227] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180985MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 880.983423] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 880.983585] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 880.985391] env[67893]: DEBUG nova.compute.manager [None req-d8626e5f-8a43-4ebd-9be0-a622c7713b51 tempest-ServersTestBootFromVolume-1376030239 tempest-ServersTestBootFromVolume-1376030239-project-member] [instance: bac538d8-3dda-4851-8aa3-d60bae70b6ff] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 881.013766] env[67893]: DEBUG nova.compute.manager [None req-d8626e5f-8a43-4ebd-9be0-a622c7713b51 tempest-ServersTestBootFromVolume-1376030239 tempest-ServersTestBootFromVolume-1376030239-project-member] [instance: bac538d8-3dda-4851-8aa3-d60bae70b6ff] Instance disappeared before build. 
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 881.056654] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d8626e5f-8a43-4ebd-9be0-a622c7713b51 tempest-ServersTestBootFromVolume-1376030239 tempest-ServersTestBootFromVolume-1376030239-project-member] Lock "bac538d8-3dda-4851-8aa3-d60bae70b6ff" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 215.621s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 881.074950] env[67893]: DEBUG nova.compute.manager [None req-64b6f36b-2c03-4c72-8c06-10aca793c471 tempest-ServerAddressesTestJSON-1960767209 tempest-ServerAddressesTestJSON-1960767209-project-member] [instance: d3019243-6b64-4d8f-87bb-ace791093969] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 881.078502] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 043c631c-bf15-4b4c-9a92-49ea51b6d405 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 881.079133] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 881.080754] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2f69fae8-d060-4156-8880-071f5ee1f969 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 881.080754] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2256af1c-4ff8-46b9-b568-c25ce8886e5f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 881.080852] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 6520080a-8bf1-4803-9099-87c3ba6e28e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 881.080900] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 19ab9782-9131-46ba-bbf2-cc021953046e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 881.081036] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2eb8d698-9436-4e91-bd10-5f5200415144 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 881.081154] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 881.081264] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 881.097474] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a9656a7e-8a7b-489e-9990-097c1e93e535 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.112775] env[67893]: DEBUG nova.compute.manager [None req-64b6f36b-2c03-4c72-8c06-10aca793c471 tempest-ServerAddressesTestJSON-1960767209 tempest-ServerAddressesTestJSON-1960767209-project-member] [instance: d3019243-6b64-4d8f-87bb-ace791093969] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 881.117403] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance cb485828-0620-48fd-a9d4-a83e690f4675 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.132927] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance dd3d49f4-83c5-4a83-9674-fed5e190743c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.138161] env[67893]: DEBUG oslo_concurrency.lockutils [None req-64b6f36b-2c03-4c72-8c06-10aca793c471 tempest-ServerAddressesTestJSON-1960767209 tempest-ServerAddressesTestJSON-1960767209-project-member] Lock "d3019243-6b64-4d8f-87bb-ace791093969" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 213.936s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 881.145967] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 9e3bbca4-2031-4a02-819c-2c9cf720eba9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.150354] env[67893]: DEBUG nova.compute.manager [None req-d9211127-9ace-48b8-aceb-69ccf40d9ddf tempest-ServerGroupTestJSON-1928902997 tempest-ServerGroupTestJSON-1928902997-project-member] [instance: e7b5f3f7-5b2d-49c7-be13-3b481f1b3ca8] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 881.156685] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 88053de1-3cc2-4776-a56e-b34aa0c93764 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.168440] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance cb76498a-b404-40f3-ac3f-93aea525abee has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.182554] env[67893]: DEBUG nova.compute.manager [None req-d9211127-9ace-48b8-aceb-69ccf40d9ddf tempest-ServerGroupTestJSON-1928902997 tempest-ServerGroupTestJSON-1928902997-project-member] [instance: e7b5f3f7-5b2d-49c7-be13-3b481f1b3ca8] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 881.190259] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance fcae7119-6233-4a52-9e52-1147f2b10ddc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.204101] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 8831483f-3fbb-4463-9f8f-868d46bb3e4e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.207386] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d9211127-9ace-48b8-aceb-69ccf40d9ddf tempest-ServerGroupTestJSON-1928902997 tempest-ServerGroupTestJSON-1928902997-project-member] Lock "e7b5f3f7-5b2d-49c7-be13-3b481f1b3ca8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 211.597s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 881.220782] env[67893]: DEBUG nova.compute.manager [None req-7d5bc7fa-334c-4f67-803a-7f02f1a95868 tempest-ServerDiagnosticsTest-519189886 tempest-ServerDiagnosticsTest-519189886-project-member] [instance: 9be15a5a-2a28-412b-a893-387b8dd9a2c4] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 881.227024] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2553f3c0-0988-4e11-a138-7e5f71e71f48 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.236078] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c05df6c1-e4c9-4276-9981-e80e584d540c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.251139] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 89e2963e-83e2-4e29-843d-7c15abdf78bc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.258055] env[67893]: DEBUG nova.compute.manager [None req-7d5bc7fa-334c-4f67-803a-7f02f1a95868 tempest-ServerDiagnosticsTest-519189886 tempest-ServerDiagnosticsTest-519189886-project-member] [instance: 9be15a5a-2a28-412b-a893-387b8dd9a2c4] Instance disappeared before build.
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 881.266466] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 88a48088-829d-40c1-85e1-6e78b8f5cea9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.282107] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5a24adaf-bced-4488-9ccb-fc996b2ba154 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.294785] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7d5bc7fa-334c-4f67-803a-7f02f1a95868 tempest-ServerDiagnosticsTest-519189886 tempest-ServerDiagnosticsTest-519189886-project-member] Lock "9be15a5a-2a28-412b-a893-387b8dd9a2c4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 210.640s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 881.302080] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance d8f75420-059d-4af1-8545-b5c4f67f4fe3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.302502] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 881.302502] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 881.309040] env[67893]: DEBUG nova.compute.manager [None req-59c8ebbd-5e3a-4e0c-9fac-742b627af296 tempest-AttachInterfacesUnderV243Test-1747987963 tempest-AttachInterfacesUnderV243Test-1747987963-project-member] [instance: 3d091507-3ab2-45da-a366-ff5d3f107134] Starting instance...
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 881.340086] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Refreshing inventories for resource provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 881.345342] env[67893]: DEBUG nova.compute.manager [None req-59c8ebbd-5e3a-4e0c-9fac-742b627af296 tempest-AttachInterfacesUnderV243Test-1747987963 tempest-AttachInterfacesUnderV243Test-1747987963-project-member] [instance: 3d091507-3ab2-45da-a366-ff5d3f107134] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 881.388618] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Updating ProviderTree inventory for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 881.388836] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Updating inventory in ProviderTree for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 881.400581] env[67893]: DEBUG oslo_concurrency.lockutils [None req-59c8ebbd-5e3a-4e0c-9fac-742b627af296 tempest-AttachInterfacesUnderV243Test-1747987963 tempest-AttachInterfacesUnderV243Test-1747987963-project-member] Lock "3d091507-3ab2-45da-a366-ff5d3f107134" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 209.756s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 881.410183] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Refreshing aggregate associations for resource provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57, aggregates: None {{(pid=67893) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 881.428814] env[67893]: DEBUG nova.compute.manager [None req-f609ccc1-50ff-441e-8db1-7ff784a0ec32 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 938d37dd-509b-4923-b192-3ce4a6d530c5] Starting instance...
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 881.456381] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Refreshing trait associations for resource provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO {{(pid=67893) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 881.462400] env[67893]: DEBUG nova.compute.manager [None req-f609ccc1-50ff-441e-8db1-7ff784a0ec32 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 938d37dd-509b-4923-b192-3ce4a6d530c5] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 881.505697] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f609ccc1-50ff-441e-8db1-7ff784a0ec32 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Lock "938d37dd-509b-4923-b192-3ce4a6d530c5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 201.095s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 881.524996] env[67893]: DEBUG nova.compute.manager [None req-33158d07-2c7f-4ce0-8a5f-ecfb5afddfc5 tempest-ServerAddressesNegativeTestJSON-2054882907 tempest-ServerAddressesNegativeTestJSON-2054882907-project-member] [instance: 40c8659f-361a-4bf7-b16c-00bfc2c98729] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 881.557025] env[67893]: DEBUG nova.compute.manager [None req-33158d07-2c7f-4ce0-8a5f-ecfb5afddfc5 tempest-ServerAddressesNegativeTestJSON-2054882907 tempest-ServerAddressesNegativeTestJSON-2054882907-project-member] [instance: 40c8659f-361a-4bf7-b16c-00bfc2c98729] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 881.605679] env[67893]: DEBUG oslo_concurrency.lockutils [None req-33158d07-2c7f-4ce0-8a5f-ecfb5afddfc5 tempest-ServerAddressesNegativeTestJSON-2054882907 tempest-ServerAddressesNegativeTestJSON-2054882907-project-member] Lock "40c8659f-361a-4bf7-b16c-00bfc2c98729" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 199.163s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 881.627468] env[67893]: DEBUG nova.compute.manager [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Starting instance...
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 881.703383] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 881.867558] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4457f057-ffb5-458b-a900-c25a20a3c02a tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Acquiring lock "a6771d3c-90ff-4403-9124-e74d74256db8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 881.867997] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4457f057-ffb5-458b-a900-c25a20a3c02a tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Lock "a6771d3c-90ff-4403-9124-e74d74256db8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 881.886016] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dff7d6d2-8a52-4b77-843e-26d625115d42 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 881.893609] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d487a9b-8cd1-43b4-a5d7-8c40b518436f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 881.924983] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0193c03-bfde-4d4b-a33a-7f4a113c60fa {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 881.932902] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64e82da3-dc16-49fe-a732-e08c0bc54578 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 881.947526] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 881.958544] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 881.974475] env[67893]: DEBUG nova.compute.resource_tracker [None
req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 881.974695] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.991s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 881.974935] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.273s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 881.976627] env[67893]: INFO nova.compute.claims [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 882.486066] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17b4a1f1-6d1c-4dc2-98e3-094e8184d5d3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 882.495337] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7dbbfbba-3a65-42d7-9b01-9d1c1cb58c06 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 882.533420] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6128e459-abe6-470f-9129-630dcd354eb1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 882.542671] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbe0b925-d4d4-49d1-a29b-69e02f2e268b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 882.557435] env[67893]: DEBUG nova.compute.provider_tree [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 882.567859] env[67893]: DEBUG nova.scheduler.client.report [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 882.589995] env[67893]: 
DEBUG oslo_concurrency.lockutils [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.615s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 882.590644] env[67893]: DEBUG nova.compute.manager [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 882.649664] env[67893]: DEBUG nova.compute.utils [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 882.651778] env[67893]: DEBUG nova.compute.manager [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 882.651878] env[67893]: DEBUG nova.network.neutron [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 882.670290] env[67893]: DEBUG nova.compute.manager [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 882.777444] env[67893]: DEBUG nova.compute.manager [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 882.811989] env[67893]: DEBUG nova.virt.hardware [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=<?>,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-15T20:39:38Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 882.812466] env[67893]: DEBUG nova.virt.hardware [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 882.812466] env[67893]: DEBUG nova.virt.hardware [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 882.812587] env[67893]: DEBUG nova.virt.hardware [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 882.812718] env[67893]: DEBUG nova.virt.hardware [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 882.812857] env[67893]: DEBUG nova.virt.hardware [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 882.814862] env[67893]: DEBUG nova.virt.hardware [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 882.814862] env[67893]: DEBUG nova.virt.hardware [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 882.815025] env[67893]: DEBUG nova.virt.hardware [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5
tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 882.815198] env[67893]: DEBUG nova.virt.hardware [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 882.815794] env[67893]: DEBUG nova.virt.hardware [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 882.816589] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a48909d8-db61-468f-b1ef-de9f7268a25e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 882.825567] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b7aa82a-bea2-49b3-9ca1-db857748c146 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 882.915887] env[67893]: DEBUG nova.policy [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0d20047565d145d196625c614f7625a4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'eae3a1869c3e4085af6b6173de554844', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 882.949031] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 882.949031] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 884.369215] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3c3f1ce2-f468-46fe-8c53-10e731c88eef tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Acquiring lock "a9656a7e-8a7b-489e-9990-097c1e93e535" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 884.401680] env[67893]: DEBUG nova.network.neutron [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Successfully created port: ebeef082-0a4c-4d1a-b8ec-73f9be3f868a {{(pid=67893) _create_port_minimal
/opt/stack/nova/nova/network/neutron.py:548}} [ 885.950832] env[67893]: DEBUG nova.network.neutron [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Successfully created port: bfc3e439-4e18-43de-9ad9-220ea6d5862c {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 886.251130] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b0d74f6c-573e-418d-b9ce-7fdd3db9c6d3 tempest-ServerRescueTestJSON-870876670 tempest-ServerRescueTestJSON-870876670-project-member] Acquiring lock "993c926b-bdc5-4f7e-992a-aac8c658ea6c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 886.251130] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b0d74f6c-573e-418d-b9ce-7fdd3db9c6d3 tempest-ServerRescueTestJSON-870876670 tempest-ServerRescueTestJSON-870876670-project-member] Lock "993c926b-bdc5-4f7e-992a-aac8c658ea6c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 887.365854] env[67893]: DEBUG nova.network.neutron [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Successfully created port: 7e21750f-f830-4467-b4c2-9c6d8812f2ed {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 888.168993] env[67893]: DEBUG oslo_concurrency.lockutils [None req-9ecb397f-b783-4c03-8a0e-2c17cb092063 tempest-AttachInterfacesV270Test-1389174808 tempest-AttachInterfacesV270Test-1389174808-project-member] Acquiring lock "429fffb6-8355-419d-8cbb-a406d723802b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 888.168993] env[67893]: DEBUG oslo_concurrency.lockutils [None req-9ecb397f-b783-4c03-8a0e-2c17cb092063 tempest-AttachInterfacesV270Test-1389174808 tempest-AttachInterfacesV270Test-1389174808-project-member] Lock "429fffb6-8355-419d-8cbb-a406d723802b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 888.831982] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b8b0949c-6630-4b49-83ca-41242a24d5d3 tempest-ServerActionsTestJSON-763823941 tempest-ServerActionsTestJSON-763823941-project-member] Acquiring lock "8c9b4750-db4e-446b-b108-fc675c6f4c69" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 888.832286] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b8b0949c-6630-4b49-83ca-41242a24d5d3 tempest-ServerActionsTestJSON-763823941 tempest-ServerActionsTestJSON-763823941-project-member] Lock "8c9b4750-db4e-446b-b108-fc675c6f4c69" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893)
inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 889.219109] env[67893]: DEBUG nova.network.neutron [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Successfully updated port: ebeef082-0a4c-4d1a-b8ec-73f9be3f868a {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 890.333161] env[67893]: DEBUG nova.compute.manager [req-8321310e-dd99-41c1-a269-916c0490a040 req-36451778-d9c3-46a6-a67d-6d1159776cb2 service nova] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Received event network-vif-plugged-ebeef082-0a4c-4d1a-b8ec-73f9be3f868a {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 890.333161] env[67893]: DEBUG oslo_concurrency.lockutils [req-8321310e-dd99-41c1-a269-916c0490a040 req-36451778-d9c3-46a6-a67d-6d1159776cb2 service nova] Acquiring lock "a9656a7e-8a7b-489e-9990-097c1e93e535-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 890.333161] env[67893]: DEBUG oslo_concurrency.lockutils [req-8321310e-dd99-41c1-a269-916c0490a040 req-36451778-d9c3-46a6-a67d-6d1159776cb2 service nova] Lock "a9656a7e-8a7b-489e-9990-097c1e93e535-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 890.333161] env[67893]: DEBUG oslo_concurrency.lockutils [req-8321310e-dd99-41c1-a269-916c0490a040 req-36451778-d9c3-46a6-a67d-6d1159776cb2 service nova] Lock "a9656a7e-8a7b-489e-9990-097c1e93e535-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 890.333505] env[67893]: DEBUG nova.compute.manager [req-8321310e-dd99-41c1-a269-916c0490a040 req-36451778-d9c3-46a6-a67d-6d1159776cb2 service nova] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] No waiting events found dispatching network-vif-plugged-ebeef082-0a4c-4d1a-b8ec-73f9be3f868a {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 890.333505] env[67893]: WARNING nova.compute.manager [req-8321310e-dd99-41c1-a269-916c0490a040 req-36451778-d9c3-46a6-a67d-6d1159776cb2 service nova] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Received unexpected event network-vif-plugged-ebeef082-0a4c-4d1a-b8ec-73f9be3f868a for instance with vm_state building and task_state deleting.
[ 891.470902] env[67893]: DEBUG nova.network.neutron [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Successfully updated port: bfc3e439-4e18-43de-9ad9-220ea6d5862c {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 892.984530] env[67893]: DEBUG nova.network.neutron [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Successfully updated port: 7e21750f-f830-4467-b4c2-9c6d8812f2ed {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 893.005901] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Acquiring lock "refresh_cache-a9656a7e-8a7b-489e-9990-097c1e93e535" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 893.006070] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Acquired lock "refresh_cache-a9656a7e-8a7b-489e-9990-097c1e93e535" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 893.006223] env[67893]: DEBUG nova.network.neutron [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 893.065186] env[67893]: DEBUG nova.network.neutron [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 893.325804] env[67893]: DEBUG nova.compute.manager [req-38fc80a2-4a99-4820-98dc-bdfadfcddd61 req-4d58c090-f0da-485e-94a6-5a70066d731b service nova] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Received event network-changed-ebeef082-0a4c-4d1a-b8ec-73f9be3f868a {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 893.326011] env[67893]: DEBUG nova.compute.manager [req-38fc80a2-4a99-4820-98dc-bdfadfcddd61 req-4d58c090-f0da-485e-94a6-5a70066d731b service nova] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Refreshing instance network info cache due to event network-changed-ebeef082-0a4c-4d1a-b8ec-73f9be3f868a. 
{{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 893.326214] env[67893]: DEBUG oslo_concurrency.lockutils [req-38fc80a2-4a99-4820-98dc-bdfadfcddd61 req-4d58c090-f0da-485e-94a6-5a70066d731b service nova] Acquiring lock "refresh_cache-a9656a7e-8a7b-489e-9990-097c1e93e535" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 893.343386] env[67893]: DEBUG oslo_concurrency.lockutils [None req-688b9f69-20d0-48ac-bdd1-47e2e364bdd5 tempest-SecurityGroupsTestJSON-756338800 tempest-SecurityGroupsTestJSON-756338800-project-member] Acquiring lock "aa02f91b-b125-42af-be9a-c565ed041288" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 893.343641] env[67893]: DEBUG oslo_concurrency.lockutils [None req-688b9f69-20d0-48ac-bdd1-47e2e364bdd5 tempest-SecurityGroupsTestJSON-756338800 tempest-SecurityGroupsTestJSON-756338800-project-member] Lock "aa02f91b-b125-42af-be9a-c565ed041288" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 893.968461] env[67893]: DEBUG nova.network.neutron [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Updating instance_info_cache with network_info: [{"id": "ebeef082-0a4c-4d1a-b8ec-73f9be3f868a", "address": "fa:16:3e:ca:cb:8e", "network": {"id": "48f560b5-cbd0-46e1-905f-430d7fb4f3ab", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1516605202", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eae3a1869c3e4085af6b6173de554844", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c3e0aae3-33d1-403b-bfaf-306f77a1422e", "external-id": "nsx-vlan-transportzone-211", "segmentation_id": 211, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapebeef082-0a", "ovs_interfaceid": "ebeef082-0a4c-4d1a-b8ec-73f9be3f868a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "bfc3e439-4e18-43de-9ad9-220ea6d5862c", "address": "fa:16:3e:a7:94:8c", "network": {"id": "a57bb317-d459-4fe9-a976-5cd91c514d8e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1507450088", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "eae3a1869c3e4085af6b6173de554844", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2",
"port_filter": true, "nsx-logical-switch-id": "b71230ae-e879-4384-88ce-fe64c86fce22", "external-id": "nsx-vlan-transportzone-473", "segmentation_id": 473, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbfc3e439-4e", "ovs_interfaceid": "bfc3e439-4e18-43de-9ad9-220ea6d5862c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7e21750f-f830-4467-b4c2-9c6d8812f2ed", "address": "fa:16:3e:70:a9:f3", "network": {"id": "48f560b5-cbd0-46e1-905f-430d7fb4f3ab", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1516605202", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.120", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eae3a1869c3e4085af6b6173de554844", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c3e0aae3-33d1-403b-bfaf-306f77a1422e", "external-id": "nsx-vlan-transportzone-211", "segmentation_id": 211, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7e21750f-f8", "ovs_interfaceid": "7e21750f-f830-4467-b4c2-9c6d8812f2ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 893.986527] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Releasing lock "refresh_cache-a9656a7e-8a7b-489e-9990-097c1e93e535" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 893.987286] env[67893]: DEBUG nova.compute.manager [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Instance network_info: |[{"id": "ebeef082-0a4c-4d1a-b8ec-73f9be3f868a", "address": "fa:16:3e:ca:cb:8e", "network": {"id": "48f560b5-cbd0-46e1-905f-430d7fb4f3ab", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1516605202", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eae3a1869c3e4085af6b6173de554844", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c3e0aae3-33d1-403b-bfaf-306f77a1422e", "external-id": "nsx-vlan-transportzone-211", "segmentation_id": 211, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapebeef082-0a", "ovs_interfaceid": "ebeef082-0a4c-4d1a-b8ec-73f9be3f868a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": 
"bfc3e439-4e18-43de-9ad9-220ea6d5862c", "address": "fa:16:3e:a7:94:8c", "network": {"id": "a57bb317-d459-4fe9-a976-5cd91c514d8e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1507450088", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "eae3a1869c3e4085af6b6173de554844", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b71230ae-e879-4384-88ce-fe64c86fce22", "external-id": "nsx-vlan-transportzone-473", "segmentation_id": 473, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbfc3e439-4e", "ovs_interfaceid": "bfc3e439-4e18-43de-9ad9-220ea6d5862c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7e21750f-f830-4467-b4c2-9c6d8812f2ed", "address": "fa:16:3e:70:a9:f3", "network": {"id": "48f560b5-cbd0-46e1-905f-430d7fb4f3ab", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1516605202", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.120", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eae3a1869c3e4085af6b6173de554844", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c3e0aae3-33d1-403b-bfaf-306f77a1422e", "external-id": "nsx-vlan-transportzone-211", "segmentation_id": 211, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7e21750f-f8", "ovs_interfaceid": "7e21750f-f830-4467-b4c2-9c6d8812f2ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 893.987614] env[67893]: DEBUG oslo_concurrency.lockutils [req-38fc80a2-4a99-4820-98dc-bdfadfcddd61 req-4d58c090-f0da-485e-94a6-5a70066d731b service nova] Acquired lock "refresh_cache-a9656a7e-8a7b-489e-9990-097c1e93e535" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 893.987799] env[67893]: DEBUG nova.network.neutron [req-38fc80a2-4a99-4820-98dc-bdfadfcddd61 req-4d58c090-f0da-485e-94a6-5a70066d731b service nova] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Refreshing network info cache for port ebeef082-0a4c-4d1a-b8ec-73f9be3f868a {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 893.989523] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ca:cb:8e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c3e0aae3-33d1-403b-bfaf-306f77a1422e', 'network-type': 
'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ebeef082-0a4c-4d1a-b8ec-73f9be3f868a', 'vif_model': 'vmxnet3'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:a7:94:8c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'b71230ae-e879-4384-88ce-fe64c86fce22', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'bfc3e439-4e18-43de-9ad9-220ea6d5862c', 'vif_model': 'vmxnet3'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:70:a9:f3', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c3e0aae3-33d1-403b-bfaf-306f77a1422e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7e21750f-f830-4467-b4c2-9c6d8812f2ed', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 894.001032] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Creating folder: Project (eae3a1869c3e4085af6b6173de554844). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 894.002195] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-193860d3-b42e-412a-9a3b-62853aa1a1ed {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 894.015362] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Created folder: Project (eae3a1869c3e4085af6b6173de554844) in parent group-v689771. [ 894.015553] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Creating folder: Instances. Parent ref: group-v689819. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 894.015778] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-75b7b95d-b721-490c-88fe-df14b2df4c01 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 894.025062] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Created folder: Instances in parent group-v689819. [ 894.025303] env[67893]: DEBUG oslo.service.loopingcall [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 894.025481] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 894.025672] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5657b08c-230f-4c4f-bf3b-c48a8b7edaf1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 894.059048] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4dd29f0f-254b-4793-be89-17d9914c6169 tempest-VolumesAdminNegativeTest-1794643428 tempest-VolumesAdminNegativeTest-1794643428-project-member] Acquiring lock "e720d161-0c76-47ab-8d24-e465109d6e8c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 894.059302] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4dd29f0f-254b-4793-be89-17d9914c6169 tempest-VolumesAdminNegativeTest-1794643428 tempest-VolumesAdminNegativeTest-1794643428-project-member] Lock "e720d161-0c76-47ab-8d24-e465109d6e8c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 894.063622] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 894.063622] env[67893]: value = "task-3455361" [ 894.063622] env[67893]: _type = "Task" [ 894.063622] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 894.072779] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455361, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 894.391263] env[67893]: DEBUG nova.network.neutron [req-38fc80a2-4a99-4820-98dc-bdfadfcddd61 req-4d58c090-f0da-485e-94a6-5a70066d731b service nova] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Updated VIF entry in instance network info cache for port ebeef082-0a4c-4d1a-b8ec-73f9be3f868a. 
{{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 894.391770] env[67893]: DEBUG nova.network.neutron [req-38fc80a2-4a99-4820-98dc-bdfadfcddd61 req-4d58c090-f0da-485e-94a6-5a70066d731b service nova] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Updating instance_info_cache with network_info: [{"id": "ebeef082-0a4c-4d1a-b8ec-73f9be3f868a", "address": "fa:16:3e:ca:cb:8e", "network": {"id": "48f560b5-cbd0-46e1-905f-430d7fb4f3ab", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1516605202", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eae3a1869c3e4085af6b6173de554844", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c3e0aae3-33d1-403b-bfaf-306f77a1422e", "external-id": "nsx-vlan-transportzone-211", "segmentation_id": 211, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapebeef082-0a", "ovs_interfaceid": "ebeef082-0a4c-4d1a-b8ec-73f9be3f868a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "bfc3e439-4e18-43de-9ad9-220ea6d5862c", "address": "fa:16:3e:a7:94:8c", "network": {"id": "a57bb317-d459-4fe9-a976-5cd91c514d8e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1507450088", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "eae3a1869c3e4085af6b6173de554844", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b71230ae-e879-4384-88ce-fe64c86fce22", "external-id": "nsx-vlan-transportzone-473", "segmentation_id": 473, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbfc3e439-4e", "ovs_interfaceid": "bfc3e439-4e18-43de-9ad9-220ea6d5862c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7e21750f-f830-4467-b4c2-9c6d8812f2ed", "address": "fa:16:3e:70:a9:f3", "network": {"id": "48f560b5-cbd0-46e1-905f-430d7fb4f3ab", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1516605202", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.120", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eae3a1869c3e4085af6b6173de554844", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c3e0aae3-33d1-403b-bfaf-306f77a1422e", "external-id": "nsx-vlan-transportzone-211", 
"segmentation_id": 211, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7e21750f-f8", "ovs_interfaceid": "7e21750f-f830-4467-b4c2-9c6d8812f2ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 894.405113] env[67893]: DEBUG oslo_concurrency.lockutils [req-38fc80a2-4a99-4820-98dc-bdfadfcddd61 req-4d58c090-f0da-485e-94a6-5a70066d731b service nova] Releasing lock "refresh_cache-a9656a7e-8a7b-489e-9990-097c1e93e535" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 894.405113] env[67893]: DEBUG nova.compute.manager [req-38fc80a2-4a99-4820-98dc-bdfadfcddd61 req-4d58c090-f0da-485e-94a6-5a70066d731b service nova] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Received event network-vif-plugged-bfc3e439-4e18-43de-9ad9-220ea6d5862c {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 894.405113] env[67893]: DEBUG oslo_concurrency.lockutils [req-38fc80a2-4a99-4820-98dc-bdfadfcddd61 req-4d58c090-f0da-485e-94a6-5a70066d731b service nova] Acquiring lock "a9656a7e-8a7b-489e-9990-097c1e93e535-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 894.405113] env[67893]: DEBUG oslo_concurrency.lockutils [req-38fc80a2-4a99-4820-98dc-bdfadfcddd61 req-4d58c090-f0da-485e-94a6-5a70066d731b service nova] Lock "a9656a7e-8a7b-489e-9990-097c1e93e535-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 894.405113] env[67893]: DEBUG oslo_concurrency.lockutils [req-38fc80a2-4a99-4820-98dc-bdfadfcddd61 req-4d58c090-f0da-485e-94a6-5a70066d731b service nova] Lock "a9656a7e-8a7b-489e-9990-097c1e93e535-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 894.405113] env[67893]: DEBUG nova.compute.manager [req-38fc80a2-4a99-4820-98dc-bdfadfcddd61 req-4d58c090-f0da-485e-94a6-5a70066d731b service nova] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] No waiting events found dispatching network-vif-plugged-bfc3e439-4e18-43de-9ad9-220ea6d5862c {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 894.405113] env[67893]: WARNING nova.compute.manager [req-38fc80a2-4a99-4820-98dc-bdfadfcddd61 req-4d58c090-f0da-485e-94a6-5a70066d731b service nova] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Received unexpected event network-vif-plugged-bfc3e439-4e18-43de-9ad9-220ea6d5862c for instance with vm_state building and task_state deleting. 
[ 894.405113] env[67893]: DEBUG nova.compute.manager [req-38fc80a2-4a99-4820-98dc-bdfadfcddd61 req-4d58c090-f0da-485e-94a6-5a70066d731b service nova] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Received event network-changed-bfc3e439-4e18-43de-9ad9-220ea6d5862c {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 894.405113] env[67893]: DEBUG nova.compute.manager [req-38fc80a2-4a99-4820-98dc-bdfadfcddd61 req-4d58c090-f0da-485e-94a6-5a70066d731b service nova] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Refreshing instance network info cache due to event network-changed-bfc3e439-4e18-43de-9ad9-220ea6d5862c. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 894.405113] env[67893]: DEBUG oslo_concurrency.lockutils [req-38fc80a2-4a99-4820-98dc-bdfadfcddd61 req-4d58c090-f0da-485e-94a6-5a70066d731b service nova] Acquiring lock "refresh_cache-a9656a7e-8a7b-489e-9990-097c1e93e535" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 894.405113] env[67893]: DEBUG oslo_concurrency.lockutils [req-38fc80a2-4a99-4820-98dc-bdfadfcddd61 req-4d58c090-f0da-485e-94a6-5a70066d731b service nova] Acquired lock "refresh_cache-a9656a7e-8a7b-489e-9990-097c1e93e535" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 894.405113] env[67893]: DEBUG nova.network.neutron [req-38fc80a2-4a99-4820-98dc-bdfadfcddd61 req-4d58c090-f0da-485e-94a6-5a70066d731b service nova] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Refreshing network info cache for port bfc3e439-4e18-43de-9ad9-220ea6d5862c {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 894.580755] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455361, 'name': CreateVM_Task, 'duration_secs': 0.468096} completed successfully. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 894.580943] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 894.581859] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 894.582033] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 894.582339] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 894.582584] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8eea5fca-d2c5-44e4-9c4a-5a41ac7c4a1b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 894.589362] env[67893]: DEBUG oslo_vmware.api [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Waiting for the task: (returnval){ [ 894.589362] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]5222960e-592e-fd91-ef38-0fc2e337e80e" [ 894.589362] env[67893]: _type = "Task" [ 894.589362] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 894.598715] env[67893]: DEBUG oslo_vmware.api [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]5222960e-592e-fd91-ef38-0fc2e337e80e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 895.054507] env[67893]: DEBUG nova.network.neutron [req-38fc80a2-4a99-4820-98dc-bdfadfcddd61 req-4d58c090-f0da-485e-94a6-5a70066d731b service nova] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Updated VIF entry in instance network info cache for port bfc3e439-4e18-43de-9ad9-220ea6d5862c. 
{{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 895.055070] env[67893]: DEBUG nova.network.neutron [req-38fc80a2-4a99-4820-98dc-bdfadfcddd61 req-4d58c090-f0da-485e-94a6-5a70066d731b service nova] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Updating instance_info_cache with network_info: [{"id": "ebeef082-0a4c-4d1a-b8ec-73f9be3f868a", "address": "fa:16:3e:ca:cb:8e", "network": {"id": "48f560b5-cbd0-46e1-905f-430d7fb4f3ab", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1516605202", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eae3a1869c3e4085af6b6173de554844", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c3e0aae3-33d1-403b-bfaf-306f77a1422e", "external-id": "nsx-vlan-transportzone-211", "segmentation_id": 211, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapebeef082-0a", "ovs_interfaceid": "ebeef082-0a4c-4d1a-b8ec-73f9be3f868a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "bfc3e439-4e18-43de-9ad9-220ea6d5862c", "address": "fa:16:3e:a7:94:8c", "network": {"id": "a57bb317-d459-4fe9-a976-5cd91c514d8e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1507450088", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "eae3a1869c3e4085af6b6173de554844", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b71230ae-e879-4384-88ce-fe64c86fce22", "external-id": "nsx-vlan-transportzone-473", "segmentation_id": 473, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbfc3e439-4e", "ovs_interfaceid": "bfc3e439-4e18-43de-9ad9-220ea6d5862c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7e21750f-f830-4467-b4c2-9c6d8812f2ed", "address": "fa:16:3e:70:a9:f3", "network": {"id": "48f560b5-cbd0-46e1-905f-430d7fb4f3ab", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1516605202", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.120", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eae3a1869c3e4085af6b6173de554844", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c3e0aae3-33d1-403b-bfaf-306f77a1422e", "external-id": "nsx-vlan-transportzone-211", 
"segmentation_id": 211, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7e21750f-f8", "ovs_interfaceid": "7e21750f-f830-4467-b4c2-9c6d8812f2ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 895.071518] env[67893]: DEBUG oslo_concurrency.lockutils [req-38fc80a2-4a99-4820-98dc-bdfadfcddd61 req-4d58c090-f0da-485e-94a6-5a70066d731b service nova] Releasing lock "refresh_cache-a9656a7e-8a7b-489e-9990-097c1e93e535" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 895.101743] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 895.102416] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 895.103154] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 895.619412] env[67893]: DEBUG nova.compute.manager [req-c8eaf8ff-939e-44a9-9a13-82090188c183 req-f42faa8c-bdaa-4a3b-9a0d-a553c8c73658 service nova] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Received event network-vif-plugged-7e21750f-f830-4467-b4c2-9c6d8812f2ed {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 895.619412] env[67893]: DEBUG oslo_concurrency.lockutils [req-c8eaf8ff-939e-44a9-9a13-82090188c183 req-f42faa8c-bdaa-4a3b-9a0d-a553c8c73658 service nova] Acquiring lock "a9656a7e-8a7b-489e-9990-097c1e93e535-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 895.619697] env[67893]: DEBUG oslo_concurrency.lockutils [req-c8eaf8ff-939e-44a9-9a13-82090188c183 req-f42faa8c-bdaa-4a3b-9a0d-a553c8c73658 service nova] Lock "a9656a7e-8a7b-489e-9990-097c1e93e535-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 895.620018] env[67893]: DEBUG oslo_concurrency.lockutils [req-c8eaf8ff-939e-44a9-9a13-82090188c183 req-f42faa8c-bdaa-4a3b-9a0d-a553c8c73658 service nova] Lock "a9656a7e-8a7b-489e-9990-097c1e93e535-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 895.620257] env[67893]: 
DEBUG nova.compute.manager [req-c8eaf8ff-939e-44a9-9a13-82090188c183 req-f42faa8c-bdaa-4a3b-9a0d-a553c8c73658 service nova] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] No waiting events found dispatching network-vif-plugged-7e21750f-f830-4467-b4c2-9c6d8812f2ed {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 895.620405] env[67893]: WARNING nova.compute.manager [req-c8eaf8ff-939e-44a9-9a13-82090188c183 req-f42faa8c-bdaa-4a3b-9a0d-a553c8c73658 service nova] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Received unexpected event network-vif-plugged-7e21750f-f830-4467-b4c2-9c6d8812f2ed for instance with vm_state building and task_state deleting. [ 895.620559] env[67893]: DEBUG nova.compute.manager [req-c8eaf8ff-939e-44a9-9a13-82090188c183 req-f42faa8c-bdaa-4a3b-9a0d-a553c8c73658 service nova] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Received event network-changed-7e21750f-f830-4467-b4c2-9c6d8812f2ed {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 895.620708] env[67893]: DEBUG nova.compute.manager [req-c8eaf8ff-939e-44a9-9a13-82090188c183 req-f42faa8c-bdaa-4a3b-9a0d-a553c8c73658 service nova] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Refreshing instance network info cache due to event network-changed-7e21750f-f830-4467-b4c2-9c6d8812f2ed. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 895.620963] env[67893]: DEBUG oslo_concurrency.lockutils [req-c8eaf8ff-939e-44a9-9a13-82090188c183 req-f42faa8c-bdaa-4a3b-9a0d-a553c8c73658 service nova] Acquiring lock "refresh_cache-a9656a7e-8a7b-489e-9990-097c1e93e535" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 895.621084] env[67893]: DEBUG oslo_concurrency.lockutils [req-c8eaf8ff-939e-44a9-9a13-82090188c183 req-f42faa8c-bdaa-4a3b-9a0d-a553c8c73658 service nova] Acquired lock "refresh_cache-a9656a7e-8a7b-489e-9990-097c1e93e535" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 895.621288] env[67893]: DEBUG nova.network.neutron [req-c8eaf8ff-939e-44a9-9a13-82090188c183 req-f42faa8c-bdaa-4a3b-9a0d-a553c8c73658 service nova] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Refreshing network info cache for port 7e21750f-f830-4467-b4c2-9c6d8812f2ed {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 895.800816] env[67893]: DEBUG oslo_concurrency.lockutils [None req-e53cb8c5-4a11-461d-8895-82fc1b757527 tempest-ServerDiagnosticsV248Test-879533068 tempest-ServerDiagnosticsV248Test-879533068-project-member] Acquiring lock "5e7e68fb-f8a5-46c9-b0b1-9fbc96c82428" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 895.801121] env[67893]: DEBUG oslo_concurrency.lockutils [None req-e53cb8c5-4a11-461d-8895-82fc1b757527 tempest-ServerDiagnosticsV248Test-879533068 tempest-ServerDiagnosticsV248Test-879533068-project-member] Lock "5e7e68fb-f8a5-46c9-b0b1-9fbc96c82428" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 896.194960] env[67893]: DEBUG nova.network.neutron [req-c8eaf8ff-939e-44a9-9a13-82090188c183 req-f42faa8c-bdaa-4a3b-9a0d-a553c8c73658 service nova] 
[instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Updated VIF entry in instance network info cache for port 7e21750f-f830-4467-b4c2-9c6d8812f2ed. {{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 896.195447] env[67893]: DEBUG nova.network.neutron [req-c8eaf8ff-939e-44a9-9a13-82090188c183 req-f42faa8c-bdaa-4a3b-9a0d-a553c8c73658 service nova] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Updating instance_info_cache with network_info: [{"id": "ebeef082-0a4c-4d1a-b8ec-73f9be3f868a", "address": "fa:16:3e:ca:cb:8e", "network": {"id": "48f560b5-cbd0-46e1-905f-430d7fb4f3ab", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1516605202", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eae3a1869c3e4085af6b6173de554844", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c3e0aae3-33d1-403b-bfaf-306f77a1422e", "external-id": "nsx-vlan-transportzone-211", "segmentation_id": 211, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapebeef082-0a", "ovs_interfaceid": "ebeef082-0a4c-4d1a-b8ec-73f9be3f868a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "bfc3e439-4e18-43de-9ad9-220ea6d5862c", "address": "fa:16:3e:a7:94:8c", "network": {"id": "a57bb317-d459-4fe9-a976-5cd91c514d8e", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1507450088", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "eae3a1869c3e4085af6b6173de554844", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b71230ae-e879-4384-88ce-fe64c86fce22", "external-id": "nsx-vlan-transportzone-473", "segmentation_id": 473, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbfc3e439-4e", "ovs_interfaceid": "bfc3e439-4e18-43de-9ad9-220ea6d5862c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7e21750f-f830-4467-b4c2-9c6d8812f2ed", "address": "fa:16:3e:70:a9:f3", "network": {"id": "48f560b5-cbd0-46e1-905f-430d7fb4f3ab", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1516605202", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.120", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eae3a1869c3e4085af6b6173de554844", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": 
"l2", "port_filter": true, "nsx-logical-switch-id": "c3e0aae3-33d1-403b-bfaf-306f77a1422e", "external-id": "nsx-vlan-transportzone-211", "segmentation_id": 211, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7e21750f-f8", "ovs_interfaceid": "7e21750f-f830-4467-b4c2-9c6d8812f2ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 896.207114] env[67893]: DEBUG oslo_concurrency.lockutils [req-c8eaf8ff-939e-44a9-9a13-82090188c183 req-f42faa8c-bdaa-4a3b-9a0d-a553c8c73658 service nova] Releasing lock "refresh_cache-a9656a7e-8a7b-489e-9990-097c1e93e535" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 899.688637] env[67893]: DEBUG oslo_concurrency.lockutils [None req-809f5967-6626-4000-83f7-7adfd49626a7 tempest-ListServerFiltersTestJSON-1036915369 tempest-ListServerFiltersTestJSON-1036915369-project-member] Acquiring lock "f27780c8-3155-480a-bb3c-e93cdac254f2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 899.689847] env[67893]: DEBUG oslo_concurrency.lockutils [None req-809f5967-6626-4000-83f7-7adfd49626a7 tempest-ListServerFiltersTestJSON-1036915369 tempest-ListServerFiltersTestJSON-1036915369-project-member] Lock "f27780c8-3155-480a-bb3c-e93cdac254f2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 900.159785] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a7457600-f5bf-48a1-86e5-d08684aa55c5 tempest-ListServerFiltersTestJSON-1036915369 tempest-ListServerFiltersTestJSON-1036915369-project-member] Acquiring lock "15515d0f-d317-4cc3-a922-c8a64654f4b2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 900.160074] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a7457600-f5bf-48a1-86e5-d08684aa55c5 tempest-ListServerFiltersTestJSON-1036915369 tempest-ListServerFiltersTestJSON-1036915369-project-member] Lock "15515d0f-d317-4cc3-a922-c8a64654f4b2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 900.782733] env[67893]: DEBUG oslo_concurrency.lockutils [None req-268244dd-f80e-4d36-8668-e5378bec8848 tempest-ListServerFiltersTestJSON-1036915369 tempest-ListServerFiltersTestJSON-1036915369-project-member] Acquiring lock "c0e59ef6-c233-490f-ab69-ab198142590a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 900.783335] env[67893]: DEBUG oslo_concurrency.lockutils [None req-268244dd-f80e-4d36-8668-e5378bec8848 tempest-ListServerFiltersTestJSON-1036915369 tempest-ListServerFiltersTestJSON-1036915369-project-member] Lock "c0e59ef6-c233-490f-ab69-ab198142590a" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 928.582318] env[67893]: WARNING oslo_vmware.rw_handles [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 928.582318] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 928.582318] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 928.582318] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 928.582318] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 928.582318] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 928.582318] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 928.582318] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 928.582318] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 928.582318] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 928.582318] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 928.582318] env[67893]: ERROR oslo_vmware.rw_handles [ 928.583059] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/1eb78104-db46-41b4-b4ae-080690901cdf/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 928.584746] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 928.585012] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Copying Virtual Disk [datastore1] vmware_temp/1eb78104-db46-41b4-b4ae-080690901cdf/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/1eb78104-db46-41b4-b4ae-080690901cdf/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 928.585334] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a664a611-45b8-446a-a06e-a426a1cffa70 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 928.592883] env[67893]: DEBUG 
oslo_vmware.api [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Waiting for the task: (returnval){ [ 928.592883] env[67893]: value = "task-3455362" [ 928.592883] env[67893]: _type = "Task" [ 928.592883] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 928.601095] env[67893]: DEBUG oslo_vmware.api [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Task: {'id': task-3455362, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 929.103726] env[67893]: DEBUG oslo_vmware.exceptions [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 929.104015] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 929.104522] env[67893]: ERROR nova.compute.manager [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 929.104522] env[67893]: Faults: ['InvalidArgument'] [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Traceback (most recent call last): [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] yield resources [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] self.driver.spawn(context, instance, image_meta, [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] self._vmops.spawn(context, instance, image_meta, injected_files, [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] self._fetch_image_if_missing(context, vi) [ 929.104522] env[67893]: ERROR 
nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] image_cache(vi, tmp_image_ds_loc) [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] vm_util.copy_virtual_disk( [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] session._wait_for_task(vmdk_copy_task) [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] return self.wait_for_task(task_ref) [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] return evt.wait() [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] result = hub.switch() [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] return self.greenlet.switch() [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] self.f(*self.args, **self.kw) [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] raise exceptions.translate_fault(task_info.error) [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Faults: ['InvalidArgument'] [ 929.104522] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] [ 929.105500] env[67893]: INFO nova.compute.manager [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 
tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Terminating instance [ 929.106408] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 929.106610] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 929.106854] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-14716736-1ab0-47ba-8301-f05bd0837d43 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 929.108962] env[67893]: DEBUG nova.compute.manager [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 929.109161] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 929.109853] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc3c3577-96e2-443b-9da9-a71e68957a44 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 929.116422] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 929.116624] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-27eb5f77-2a34-4832-93e5-20f03908024a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 929.118625] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 929.118795] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 929.119726] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-38cc3650-4b1e-4866-a8b1-af290a129952 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 929.123877] env[67893]: DEBUG oslo_vmware.api [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Waiting for the task: (returnval){ [ 929.123877] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]523d4238-7c6c-7428-e214-29f21a3a6f6b" [ 929.123877] env[67893]: _type = "Task" [ 929.123877] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 929.130833] env[67893]: DEBUG oslo_vmware.api [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]523d4238-7c6c-7428-e214-29f21a3a6f6b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 929.186058] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 929.186058] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 929.186276] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Deleting the datastore file [datastore1] 043c631c-bf15-4b4c-9a92-49ea51b6d405 {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 929.186462] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-fea600df-f5b5-4087-a763-5b6ddcd8a2b6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 929.192814] env[67893]: DEBUG oslo_vmware.api [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Waiting for the task: (returnval){ [ 929.192814] env[67893]: value = "task-3455364" [ 929.192814] env[67893]: _type = "Task" [ 929.192814] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 929.200377] env[67893]: DEBUG oslo_vmware.api [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Task: {'id': task-3455364, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 929.634025] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 929.634307] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Creating directory with path [datastore1] vmware_temp/c8be3c10-d9c4-41e3-b693-dc99435c6915/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 929.634505] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5283a99b-b1e8-497e-a450-e43ca926de52 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 929.645729] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Created directory with path [datastore1] vmware_temp/c8be3c10-d9c4-41e3-b693-dc99435c6915/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 929.645935] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Fetch image to [datastore1] vmware_temp/c8be3c10-d9c4-41e3-b693-dc99435c6915/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 929.646132] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/c8be3c10-d9c4-41e3-b693-dc99435c6915/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 929.646835] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3dee847-24d4-4f93-9a3f-574a74bae03e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 929.653355] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94779245-f5e3-4bea-bcff-e649d04a2875 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 929.662018] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff591c5a-db63-4740-8704-30a0986b987a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 929.691305] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4149401a-e437-4e5c-9264-798a535537b9 {{(pid=67893) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 929.701988] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fb1cb2d5-76f0-4549-bd79-a61b262e5ff4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 929.703589] env[67893]: DEBUG oslo_vmware.api [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Task: {'id': task-3455364, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077904} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 929.703830] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 929.704015] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 929.704217] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 929.704368] env[67893]: INFO nova.compute.manager [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 929.706456] env[67893]: DEBUG nova.compute.claims [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 929.706626] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 929.706877] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 929.736611] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 929.849554] env[67893]: DEBUG oslo_vmware.rw_handles [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c8be3c10-d9c4-41e3-b693-dc99435c6915/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 929.912582] env[67893]: DEBUG oslo_vmware.rw_handles [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 929.912799] env[67893]: DEBUG oslo_vmware.rw_handles [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c8be3c10-d9c4-41e3-b693-dc99435c6915/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 930.146174] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09fc7778-fea3-4002-a09c-e8ce7c080ed8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 930.153980] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9883bcae-1005-480f-8976-d20bde9f5e02 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 930.183116] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acc6ea83-2170-43c9-8e76-56f1e2b233fd {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 930.189741] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44dccb15-0f85-4f77-9ac7-307a854bd35b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 930.202661] env[67893]: DEBUG nova.compute.provider_tree [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 930.211406] env[67893]: DEBUG nova.scheduler.client.report [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 930.225616] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.519s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 930.226171] env[67893]: ERROR nova.compute.manager [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 930.226171] env[67893]: Faults: ['InvalidArgument'] [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Traceback (most recent call last): [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] self.driver.spawn(context, instance, image_meta, [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] self._vmops.spawn(context, instance, image_meta, injected_files, [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] self._fetch_image_if_missing(context, vi) [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] image_cache(vi, tmp_image_ds_loc) [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] vm_util.copy_virtual_disk( [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] session._wait_for_task(vmdk_copy_task) [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] return self.wait_for_task(task_ref) [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] return evt.wait() [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] result = hub.switch() [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] return self.greenlet.switch() [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] self.f(*self.args, **self.kw) [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 
043c631c-bf15-4b4c-9a92-49ea51b6d405] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] raise exceptions.translate_fault(task_info.error) [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Faults: ['InvalidArgument'] [ 930.226171] env[67893]: ERROR nova.compute.manager [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] [ 930.227183] env[67893]: DEBUG nova.compute.utils [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 930.228252] env[67893]: DEBUG nova.compute.manager [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Build of instance 043c631c-bf15-4b4c-9a92-49ea51b6d405 was re-scheduled: A specified parameter was not correct: fileType [ 930.228252] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 930.228652] env[67893]: DEBUG nova.compute.manager [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 930.228826] env[67893]: DEBUG nova.compute.manager [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 930.229022] env[67893]: DEBUG nova.compute.manager [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 930.229201] env[67893]: DEBUG nova.network.neutron [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 930.586849] env[67893]: DEBUG nova.network.neutron [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 930.604023] env[67893]: INFO nova.compute.manager [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Took 0.37 seconds to deallocate network for instance. [ 930.709060] env[67893]: INFO nova.scheduler.client.report [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Deleted allocations for instance 043c631c-bf15-4b4c-9a92-49ea51b6d405 [ 930.734685] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eeadb39b-de75-466c-bda7-159c8cd3a94c tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Lock "043c631c-bf15-4b4c-9a92-49ea51b6d405" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 291.387s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 930.736142] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b6a4dab1-4cfc-40f1-876a-0d8b309b0cb1 tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Lock "043c631c-bf15-4b4c-9a92-49ea51b6d405" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 93.927s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 930.736142] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b6a4dab1-4cfc-40f1-876a-0d8b309b0cb1 tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Acquiring lock "043c631c-bf15-4b4c-9a92-49ea51b6d405-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 930.736330] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b6a4dab1-4cfc-40f1-876a-0d8b309b0cb1 tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Lock 
"043c631c-bf15-4b4c-9a92-49ea51b6d405-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 930.736412] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b6a4dab1-4cfc-40f1-876a-0d8b309b0cb1 tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Lock "043c631c-bf15-4b4c-9a92-49ea51b6d405-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 930.738866] env[67893]: INFO nova.compute.manager [None req-b6a4dab1-4cfc-40f1-876a-0d8b309b0cb1 tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Terminating instance [ 930.740132] env[67893]: DEBUG nova.compute.manager [None req-b6a4dab1-4cfc-40f1-876a-0d8b309b0cb1 tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 930.740331] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-b6a4dab1-4cfc-40f1-876a-0d8b309b0cb1 tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 930.740807] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-df4a0b38-863e-48a6-b5af-bfb8be8d2848 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 930.745635] env[67893]: DEBUG nova.compute.manager [None req-9aeba813-a345-45e6-936d-49a7ba9d1b0a tempest-ServersAdminNegativeTestJSON-1263096693 tempest-ServersAdminNegativeTestJSON-1263096693-project-member] [instance: cb485828-0620-48fd-a9d4-a83e690f4675] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 930.751737] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c085f9f-fdec-4e48-a0c5-29ffb970aa90 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 930.770771] env[67893]: DEBUG nova.compute.manager [None req-9aeba813-a345-45e6-936d-49a7ba9d1b0a tempest-ServersAdminNegativeTestJSON-1263096693 tempest-ServersAdminNegativeTestJSON-1263096693-project-member] [instance: cb485828-0620-48fd-a9d4-a83e690f4675] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 930.781380] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-b6a4dab1-4cfc-40f1-876a-0d8b309b0cb1 tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 043c631c-bf15-4b4c-9a92-49ea51b6d405 could not be found. 
[ 930.781380] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-b6a4dab1-4cfc-40f1-876a-0d8b309b0cb1 tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 930.781586] env[67893]: INFO nova.compute.manager [None req-b6a4dab1-4cfc-40f1-876a-0d8b309b0cb1 tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Took 0.04 seconds to destroy the instance on the hypervisor. [ 930.781693] env[67893]: DEBUG oslo.service.loopingcall [None req-b6a4dab1-4cfc-40f1-876a-0d8b309b0cb1 tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 930.782127] env[67893]: DEBUG nova.compute.manager [-] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 930.782226] env[67893]: DEBUG nova.network.neutron [-] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 930.797205] env[67893]: DEBUG oslo_concurrency.lockutils [None req-9aeba813-a345-45e6-936d-49a7ba9d1b0a tempest-ServersAdminNegativeTestJSON-1263096693 tempest-ServersAdminNegativeTestJSON-1263096693-project-member] Lock "cb485828-0620-48fd-a9d4-a83e690f4675" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 246.139s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 930.807718] env[67893]: DEBUG nova.compute.manager [None req-7f6166e4-36b0-4dfa-827b-6fe59bbe057c tempest-ServerPasswordTestJSON-451294384 tempest-ServerPasswordTestJSON-451294384-project-member] [instance: dd3d49f4-83c5-4a83-9674-fed5e190743c] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 930.810452] env[67893]: DEBUG nova.network.neutron [-] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 930.817384] env[67893]: INFO nova.compute.manager [-] [instance: 043c631c-bf15-4b4c-9a92-49ea51b6d405] Took 0.04 seconds to deallocate network for instance. [ 930.835038] env[67893]: DEBUG nova.compute.manager [None req-7f6166e4-36b0-4dfa-827b-6fe59bbe057c tempest-ServerPasswordTestJSON-451294384 tempest-ServerPasswordTestJSON-451294384-project-member] [instance: dd3d49f4-83c5-4a83-9674-fed5e190743c] Instance disappeared before build. 
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 930.855275] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7f6166e4-36b0-4dfa-827b-6fe59bbe057c tempest-ServerPasswordTestJSON-451294384 tempest-ServerPasswordTestJSON-451294384-project-member] Lock "dd3d49f4-83c5-4a83-9674-fed5e190743c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 242.069s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 930.863732] env[67893]: DEBUG nova.compute.manager [None req-5584c123-651e-44db-84e7-9991ba49463b tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 9e3bbca4-2031-4a02-819c-2c9cf720eba9] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 930.888416] env[67893]: DEBUG nova.compute.manager [None req-5584c123-651e-44db-84e7-9991ba49463b tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] [instance: 9e3bbca4-2031-4a02-819c-2c9cf720eba9] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 930.906681] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b6a4dab1-4cfc-40f1-876a-0d8b309b0cb1 tempest-ServerDiagnosticsNegativeTest-1297473187 tempest-ServerDiagnosticsNegativeTest-1297473187-project-member] Lock "043c631c-bf15-4b4c-9a92-49ea51b6d405" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.171s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 930.912143] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5584c123-651e-44db-84e7-9991ba49463b tempest-MigrationsAdminTest-2095217060 tempest-MigrationsAdminTest-2095217060-project-member] Lock "9e3bbca4-2031-4a02-819c-2c9cf720eba9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 241.406s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 930.921063] env[67893]: DEBUG nova.compute.manager [None req-eb89611c-e2f2-4ff4-a31a-613d7b7b9565 tempest-ServersTestManualDisk-14099750 tempest-ServersTestManualDisk-14099750-project-member] [instance: 88053de1-3cc2-4776-a56e-b34aa0c93764] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 930.947376] env[67893]: DEBUG nova.compute.manager [None req-eb89611c-e2f2-4ff4-a31a-613d7b7b9565 tempest-ServersTestManualDisk-14099750 tempest-ServersTestManualDisk-14099750-project-member] [instance: 88053de1-3cc2-4776-a56e-b34aa0c93764] Instance disappeared before build. 
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 930.967191] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb89611c-e2f2-4ff4-a31a-613d7b7b9565 tempest-ServersTestManualDisk-14099750 tempest-ServersTestManualDisk-14099750-project-member] Lock "88053de1-3cc2-4776-a56e-b34aa0c93764" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 226.950s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 930.978018] env[67893]: DEBUG nova.compute.manager [None req-74732dd9-b3b5-4eab-800b-ffe44686edab tempest-InstanceActionsTestJSON-1015446130 tempest-InstanceActionsTestJSON-1015446130-project-member] [instance: cb76498a-b404-40f3-ac3f-93aea525abee] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 931.002955] env[67893]: DEBUG nova.compute.manager [None req-74732dd9-b3b5-4eab-800b-ffe44686edab tempest-InstanceActionsTestJSON-1015446130 tempest-InstanceActionsTestJSON-1015446130-project-member] [instance: cb76498a-b404-40f3-ac3f-93aea525abee] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 931.023120] env[67893]: DEBUG oslo_concurrency.lockutils [None req-74732dd9-b3b5-4eab-800b-ffe44686edab tempest-InstanceActionsTestJSON-1015446130 tempest-InstanceActionsTestJSON-1015446130-project-member] Lock "cb76498a-b404-40f3-ac3f-93aea525abee" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 217.082s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 931.032319] env[67893]: DEBUG nova.compute.manager [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Starting instance... 
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 931.084128] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 931.084380] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 931.085821] env[67893]: INFO nova.compute.claims [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 931.448255] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbd4de31-c4e3-42ab-97b3-69d34106f2fc {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 931.455845] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0c9f868-caeb-4be1-b674-74effc7bfd50 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 931.484636] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7083ca4-312b-4984-a687-d93ad27fceec {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 931.492453] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8be58bee-46c7-4492-8820-840d5d1ad494 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 931.505495] env[67893]: DEBUG nova.compute.provider_tree [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 931.513875] env[67893]: DEBUG nova.scheduler.client.report [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 931.532897] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.448s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 931.533431] env[67893]: DEBUG nova.compute.manager [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 931.568569] env[67893]: DEBUG nova.compute.utils [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 931.569958] env[67893]: DEBUG nova.compute.manager [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 931.570168] env[67893]: DEBUG nova.network.neutron [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 931.581144] env[67893]: DEBUG nova.compute.manager [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 931.651352] env[67893]: DEBUG nova.policy [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1ffbac6fa16441879fe4ffabad3866e0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '502e83adfe03457898b6e2050b91a610', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 931.672101] env[67893]: DEBUG nova.compute.manager [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 931.711095] env[67893]: DEBUG nova.virt.hardware [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 931.711412] env[67893]: DEBUG nova.virt.hardware [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 931.711583] env[67893]: DEBUG nova.virt.hardware [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 931.711765] env[67893]: DEBUG nova.virt.hardware [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 931.711917] env[67893]: DEBUG nova.virt.hardware [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 931.712267] env[67893]: DEBUG nova.virt.hardware [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 931.712586] env[67893]: DEBUG nova.virt.hardware [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 931.712703] env[67893]: DEBUG nova.virt.hardware [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Build topologies for 1 vcpu(s) 1:1:1 
{{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 931.713286] env[67893]: DEBUG nova.virt.hardware [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 931.713286] env[67893]: DEBUG nova.virt.hardware [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 931.713474] env[67893]: DEBUG nova.virt.hardware [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 931.715125] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0dec28d-bbf9-4069-8946-ecb079a95427 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 931.725480] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-804b5aaf-423d-4dd8-9327-cdd42458c674 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 932.156182] env[67893]: DEBUG nova.network.neutron [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Successfully created port: fba339f9-6c6f-4fe3-9b87-3efd1e84b077 {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 933.211736] env[67893]: DEBUG nova.network.neutron [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Successfully updated port: fba339f9-6c6f-4fe3-9b87-3efd1e84b077 {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 933.233204] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Acquiring lock "refresh_cache-fcae7119-6233-4a52-9e52-1147f2b10ddc" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 933.234210] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Acquired lock "refresh_cache-fcae7119-6233-4a52-9e52-1147f2b10ddc" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 933.234489] env[67893]: DEBUG nova.network.neutron [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: 
fcae7119-6233-4a52-9e52-1147f2b10ddc] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 933.284803] env[67893]: DEBUG nova.compute.manager [req-16b5ddb8-477b-4b9b-905d-5d7994113b2a req-49b64359-cb5a-4239-b757-bdfada9cf2a3 service nova] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Received event network-vif-plugged-fba339f9-6c6f-4fe3-9b87-3efd1e84b077 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 933.285040] env[67893]: DEBUG oslo_concurrency.lockutils [req-16b5ddb8-477b-4b9b-905d-5d7994113b2a req-49b64359-cb5a-4239-b757-bdfada9cf2a3 service nova] Acquiring lock "fcae7119-6233-4a52-9e52-1147f2b10ddc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 933.285403] env[67893]: DEBUG oslo_concurrency.lockutils [req-16b5ddb8-477b-4b9b-905d-5d7994113b2a req-49b64359-cb5a-4239-b757-bdfada9cf2a3 service nova] Lock "fcae7119-6233-4a52-9e52-1147f2b10ddc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 933.285582] env[67893]: DEBUG oslo_concurrency.lockutils [req-16b5ddb8-477b-4b9b-905d-5d7994113b2a req-49b64359-cb5a-4239-b757-bdfada9cf2a3 service nova] Lock "fcae7119-6233-4a52-9e52-1147f2b10ddc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 933.286813] env[67893]: DEBUG nova.compute.manager [req-16b5ddb8-477b-4b9b-905d-5d7994113b2a req-49b64359-cb5a-4239-b757-bdfada9cf2a3 service nova] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] No waiting events found dispatching network-vif-plugged-fba339f9-6c6f-4fe3-9b87-3efd1e84b077 {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 933.287077] env[67893]: WARNING nova.compute.manager [req-16b5ddb8-477b-4b9b-905d-5d7994113b2a req-49b64359-cb5a-4239-b757-bdfada9cf2a3 service nova] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Received unexpected event network-vif-plugged-fba339f9-6c6f-4fe3-9b87-3efd1e84b077 for instance with vm_state building and task_state spawning. [ 933.291695] env[67893]: DEBUG nova.network.neutron [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Instance cache missing network info. 
{{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 933.518035] env[67893]: DEBUG nova.network.neutron [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Updating instance_info_cache with network_info: [{"id": "fba339f9-6c6f-4fe3-9b87-3efd1e84b077", "address": "fa:16:3e:55:e0:e3", "network": {"id": "634876e1-c856-4798-a4cc-e3a5e15d71fe", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1835353378-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "502e83adfe03457898b6e2050b91a610", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "db1f7867-8524-469c-ab47-d2c9e2751d98", "external-id": "nsx-vlan-transportzone-130", "segmentation_id": 130, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfba339f9-6c", "ovs_interfaceid": "fba339f9-6c6f-4fe3-9b87-3efd1e84b077", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 933.530121] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Releasing lock "refresh_cache-fcae7119-6233-4a52-9e52-1147f2b10ddc" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 933.530432] env[67893]: DEBUG nova.compute.manager [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Instance network_info: |[{"id": "fba339f9-6c6f-4fe3-9b87-3efd1e84b077", "address": "fa:16:3e:55:e0:e3", "network": {"id": "634876e1-c856-4798-a4cc-e3a5e15d71fe", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1835353378-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "502e83adfe03457898b6e2050b91a610", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "db1f7867-8524-469c-ab47-d2c9e2751d98", "external-id": "nsx-vlan-transportzone-130", "segmentation_id": 130, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfba339f9-6c", "ovs_interfaceid": "fba339f9-6c6f-4fe3-9b87-3efd1e84b077", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, 
"meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 933.530840] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:55:e0:e3', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'db1f7867-8524-469c-ab47-d2c9e2751d98', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'fba339f9-6c6f-4fe3-9b87-3efd1e84b077', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 933.538939] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Creating folder: Project (502e83adfe03457898b6e2050b91a610). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 933.540048] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a0eea952-31cd-4304-806a-bda3677a01c2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 933.552188] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Created folder: Project (502e83adfe03457898b6e2050b91a610) in parent group-v689771. [ 933.552394] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Creating folder: Instances. Parent ref: group-v689822. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 933.552631] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-42c34369-63da-4d83-872f-0073a05671d3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 933.562894] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Created folder: Instances in parent group-v689822. [ 933.563156] env[67893]: DEBUG oslo.service.loopingcall [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 933.563358] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 933.563572] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cda5cd2e-6f9d-4db3-87cb-375b73f32708 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 933.586685] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 933.586685] env[67893]: value = "task-3455367" [ 933.586685] env[67893]: _type = "Task" [ 933.586685] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 933.596160] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455367, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 934.101041] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455367, 'name': CreateVM_Task, 'duration_secs': 0.324656} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 934.101481] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 934.102497] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 934.102765] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 934.103278] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 934.103643] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-235600a6-31ed-4f6b-83c8-237d625c653a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 934.114024] env[67893]: DEBUG oslo_vmware.api [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Waiting for the task: (returnval){ [ 934.114024] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]5242a598-7e04-dcb7-5c3b-f78d71cc4a64" [ 934.114024] env[67893]: _type = "Task" [ 
934.114024] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 934.126811] env[67893]: DEBUG oslo_vmware.api [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]5242a598-7e04-dcb7-5c3b-f78d71cc4a64, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 934.625542] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 934.625913] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 934.626190] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 935.347027] env[67893]: DEBUG oslo_concurrency.lockutils [None req-11017f01-7cf4-413e-9b37-8dbf3743e9a2 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Acquiring lock "fcae7119-6233-4a52-9e52-1147f2b10ddc" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 935.522659] env[67893]: DEBUG nova.compute.manager [req-c9d1b771-e340-4843-ad8e-b862849a497a req-b7b6c3a4-c082-4587-889e-6764c311f3e1 service nova] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Received event network-changed-fba339f9-6c6f-4fe3-9b87-3efd1e84b077 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 935.522955] env[67893]: DEBUG nova.compute.manager [req-c9d1b771-e340-4843-ad8e-b862849a497a req-b7b6c3a4-c082-4587-889e-6764c311f3e1 service nova] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Refreshing instance network info cache due to event network-changed-fba339f9-6c6f-4fe3-9b87-3efd1e84b077. 
{{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 935.523083] env[67893]: DEBUG oslo_concurrency.lockutils [req-c9d1b771-e340-4843-ad8e-b862849a497a req-b7b6c3a4-c082-4587-889e-6764c311f3e1 service nova] Acquiring lock "refresh_cache-fcae7119-6233-4a52-9e52-1147f2b10ddc" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 935.523217] env[67893]: DEBUG oslo_concurrency.lockutils [req-c9d1b771-e340-4843-ad8e-b862849a497a req-b7b6c3a4-c082-4587-889e-6764c311f3e1 service nova] Acquired lock "refresh_cache-fcae7119-6233-4a52-9e52-1147f2b10ddc" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 935.523376] env[67893]: DEBUG nova.network.neutron [req-c9d1b771-e340-4843-ad8e-b862849a497a req-b7b6c3a4-c082-4587-889e-6764c311f3e1 service nova] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Refreshing network info cache for port fba339f9-6c6f-4fe3-9b87-3efd1e84b077 {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 936.356190] env[67893]: DEBUG nova.network.neutron [req-c9d1b771-e340-4843-ad8e-b862849a497a req-b7b6c3a4-c082-4587-889e-6764c311f3e1 service nova] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Updated VIF entry in instance network info cache for port fba339f9-6c6f-4fe3-9b87-3efd1e84b077. {{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 936.356554] env[67893]: DEBUG nova.network.neutron [req-c9d1b771-e340-4843-ad8e-b862849a497a req-b7b6c3a4-c082-4587-889e-6764c311f3e1 service nova] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Updating instance_info_cache with network_info: [{"id": "fba339f9-6c6f-4fe3-9b87-3efd1e84b077", "address": "fa:16:3e:55:e0:e3", "network": {"id": "634876e1-c856-4798-a4cc-e3a5e15d71fe", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1835353378-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "502e83adfe03457898b6e2050b91a610", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "db1f7867-8524-469c-ab47-d2c9e2751d98", "external-id": "nsx-vlan-transportzone-130", "segmentation_id": 130, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfba339f9-6c", "ovs_interfaceid": "fba339f9-6c6f-4fe3-9b87-3efd1e84b077", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 936.370734] env[67893]: DEBUG oslo_concurrency.lockutils [req-c9d1b771-e340-4843-ad8e-b862849a497a req-b7b6c3a4-c082-4587-889e-6764c311f3e1 service nova] Releasing lock "refresh_cache-fcae7119-6233-4a52-9e52-1147f2b10ddc" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 937.859553] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) 
run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 938.888901] env[67893]: DEBUG oslo_concurrency.lockutils [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Acquiring lock "efdb0a7e-403d-4de5-8c09-72b9c8f9cd79" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 938.889277] env[67893]: DEBUG oslo_concurrency.lockutils [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Lock "efdb0a7e-403d-4de5-8c09-72b9c8f9cd79" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 939.858608] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 940.854269] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 940.854558] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 940.877779] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 940.877945] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 940.878082] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 940.897325] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 940.897483] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Skipping network cache update for instance because it is Building. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 940.897613] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 940.897739] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 940.897864] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 940.897987] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 940.898125] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 940.898243] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 940.898402] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 940.898526] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 940.898647] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 941.858905] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 941.859240] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... 
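
Editor's note: the burst above comes from oslo.service's periodic task machinery: each decorated method runs on the service's tick, _heal_instance_info_cache skips every instance still in the Building state, and _reclaim_queued_deletes gates itself on CONF.reclaim_instance_interval and returns immediately when it is non-positive, which is exactly what "skipping..." records. A stdlib-only sketch of the registration-and-gate pattern; the decorator and Conf class are simplified stand-ins for oslo_service.periodic_task and oslo.config:

    PERIODIC_TASKS = []

    def periodic_task(func):
        # Simplified stand-in for the oslo_service.periodic_task decorator;
        # the real one also carries spacing/interval metadata.
        PERIODIC_TASKS.append(func)
        return func

    class Conf:
        reclaim_instance_interval = 0  # value implied by the log

    CONF = Conf()

    @periodic_task
    def _poll_rebooting_instances():
        pass

    @periodic_task
    def _reclaim_queued_deletes():
        # Self-gating, as the record above shows: a non-positive interval
        # disables reclamation entirely.
        if CONF.reclaim_instance_interval <= 0:
            print("CONF.reclaim_instance_interval <= 0, skipping...")
            return

    def run_periodic_tasks():
        for task in PERIODIC_TASKS:
            print(f"Running periodic task {task.__name__}")
            task()

    run_periodic_tasks()
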
{{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 942.859382] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 942.859645] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 942.859768] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 942.871566] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 942.871796] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 942.871977] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 942.872337] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 942.873383] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81a8ba2d-7795-44b6-bc63-9fddf883d5e0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 942.882559] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5565480c-cff5-4211-a3e4-6f85a75a01e3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 942.897685] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05ecc1f5-f579-4280-b436-9a5886567f06 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 942.904030] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b32556c2-8552-418e-bad7-df74fc63c1ee {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 942.932585] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] 
Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180996MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 942.932755] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 942.932956] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 943.009105] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 943.009105] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2f69fae8-d060-4156-8880-071f5ee1f969 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 943.009105] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2256af1c-4ff8-46b9-b568-c25ce8886e5f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 943.009105] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 6520080a-8bf1-4803-9099-87c3ba6e28e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 943.009105] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 19ab9782-9131-46ba-bbf2-cc021953046e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 943.009105] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2eb8d698-9436-4e91-bd10-5f5200415144 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
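
Editor's note: each "actively managed ... allocations in placement" record above and below carries the instance's placement allocation dict; the tracker folds these into the totals it reports later as the final resource view. A hedged sketch of that aggregation; the allocation shape is copied from the records, while summarize() is a hypothetical helper rather than the tracker's real code path:

    ALLOCATIONS = {
        "96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5":
            {"resources": {"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1}},
        "2f69fae8-d060-4156-8880-071f5ee1f969":
            {"resources": {"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1}},
    }

    def summarize(allocations):
        totals = {"DISK_GB": 0, "MEMORY_MB": 0, "VCPU": 0}
        for alloc in allocations.values():
            for rc, amount in alloc["resources"].items():
                totals[rc] += amount
        return totals

    print(summarize(ALLOCATIONS))
    # {'DISK_GB': 2, 'MEMORY_MB': 256, 'VCPU': 2} -- extended over all ten
    # actively managed instances, this feeds the used_vcpus/used_disk
    # figures in the "Final resource view" record further down.
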
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 943.009105] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 943.009105] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 943.009105] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a9656a7e-8a7b-489e-9990-097c1e93e535 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 943.009105] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance fcae7119-6233-4a52-9e52-1147f2b10ddc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 943.020010] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 8831483f-3fbb-4463-9f8f-868d46bb3e4e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.031868] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2553f3c0-0988-4e11-a138-7e5f71e71f48 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.042134] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c05df6c1-e4c9-4276-9981-e80e584d540c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.051446] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 89e2963e-83e2-4e29-843d-7c15abdf78bc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.061095] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 88a48088-829d-40c1-85e1-6e78b8f5cea9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.070495] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5a24adaf-bced-4488-9ccb-fc996b2ba154 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.081053] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance d8f75420-059d-4af1-8545-b5c4f67f4fe3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.090844] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a6771d3c-90ff-4403-9124-e74d74256db8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.100907] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 993c926b-bdc5-4f7e-992a-aac8c658ea6c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.110373] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 429fffb6-8355-419d-8cbb-a406d723802b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.120071] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 8c9b4750-db4e-446b-b108-fc675c6f4c69 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.129448] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance aa02f91b-b125-42af-be9a-c565ed041288 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.140371] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance e720d161-0c76-47ab-8d24-e465109d6e8c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.150655] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5e7e68fb-f8a5-46c9-b0b1-9fbc96c82428 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.160532] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance f27780c8-3155-480a-bb3c-e93cdac254f2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.169456] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 15515d0f-d317-4cc3-a922-c8a64654f4b2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.179966] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c0e59ef6-c233-490f-ab69-ab198142590a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.189026] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance efdb0a7e-403d-4de5-8c09-72b9c8f9cd79 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.189026] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 943.189026] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 943.497502] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-496d1584-b46a-407f-aa5e-e44737b1aa5f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 943.505168] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67257938-10d3-4734-ad10-8051b783a014 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 943.534041] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-225d1342-722b-4cdf-af0a-e9a6cb8a9e7b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 943.540707] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2169834b-e55e-4087-8729-1f0d7d77f5af {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 943.553193] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 943.561210] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 943.574846] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 943.575030] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.642s {{(pid=67893) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 944.574692] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 977.301871] env[67893]: WARNING oslo_vmware.rw_handles [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 977.301871] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 977.301871] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 977.301871] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 977.301871] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 977.301871] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 977.301871] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 977.301871] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 977.301871] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 977.301871] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 977.301871] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 977.301871] env[67893]: ERROR oslo_vmware.rw_handles [ 977.302676] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/c8be3c10-d9c4-41e3-b693-dc99435c6915/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 977.304171] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 977.304440] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Copying Virtual Disk [datastore1] vmware_temp/c8be3c10-d9c4-41e3-b693-dc99435c6915/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/c8be3c10-d9c4-41e3-b693-dc99435c6915/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 977.304717] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d79f3af1-7bd9-491c-8952-b4bb7f7a4550 {{(pid=67893) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 977.314352] env[67893]: DEBUG oslo_vmware.api [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Waiting for the task: (returnval){ [ 977.314352] env[67893]: value = "task-3455368" [ 977.314352] env[67893]: _type = "Task" [ 977.314352] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 977.320822] env[67893]: DEBUG oslo_vmware.api [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Task: {'id': task-3455368, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 977.825225] env[67893]: DEBUG oslo_vmware.exceptions [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 977.825702] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 977.826316] env[67893]: ERROR nova.compute.manager [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 977.826316] env[67893]: Faults: ['InvalidArgument'] [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Traceback (most recent call last): [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] yield resources [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] self.driver.spawn(context, instance, image_meta, [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] self._fetch_image_if_missing(context, vi) [ 
977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] image_cache(vi, tmp_image_ds_loc) [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] vm_util.copy_virtual_disk( [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] session._wait_for_task(vmdk_copy_task) [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] return self.wait_for_task(task_ref) [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] return evt.wait() [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] result = hub.switch() [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] return self.greenlet.switch() [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] self.f(*self.args, **self.kw) [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] raise exceptions.translate_fault(task_info.error) [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Faults: ['InvalidArgument'] [ 977.826316] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] [ 977.827487] env[67893]: INFO nova.compute.manager [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 
tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Terminating instance [ 977.828239] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 977.828450] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 977.829120] env[67893]: DEBUG nova.compute.manager [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 977.829313] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 977.829568] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a63dd5cf-1e9d-4be1-81cc-2ccd6f66ca0e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 977.832212] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02d157a7-f588-4c6e-b182-1a27441a4cb9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 977.839363] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 977.840053] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-38ea7e35-02d4-432f-8316-d5aa0bb815cc {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 977.842469] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 977.842668] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Folder [datastore1] devstack-image-cache_base created. 
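
Editor's note: directory creation for the image cache is idempotent: mkdir invokes FileManager.MakeDirectory (with parent creation) and treats "already exists" as success, so two concurrent image fetches can both safely "create" devstack-image-cache_base. A stdlib sketch of the same contract against a local filesystem, paths chosen only for illustration:

    import os

    def mkdir_idempotent(path):
        # Losing a creation race is treated as success, mirroring the
        # MakeDirectory call above.
        try:
            os.makedirs(path)
            print(f"Created directory with path {path}")
        except FileExistsError:
            print(f"Folder {path} already existed; treating as created.")

    mkdir_idempotent("/tmp/devstack-image-cache_base")
    mkdir_idempotent("/tmp/devstack-image-cache_base")  # no-op second time
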
{{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 977.843639] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-acc5c5ba-a90a-4f9a-9263-d28e3b8abef6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 977.849222] env[67893]: DEBUG oslo_vmware.api [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Waiting for the task: (returnval){ [ 977.849222] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]526c2b38-bbff-6a74-abfc-bc208730ca45" [ 977.849222] env[67893]: _type = "Task" [ 977.849222] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 977.856599] env[67893]: DEBUG oslo_vmware.api [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]526c2b38-bbff-6a74-abfc-bc208730ca45, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 977.911370] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 977.911625] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 977.911838] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Deleting the datastore file [datastore1] 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5 {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 977.912084] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f3ef7e3c-c9ad-4d04-9339-5af16520aab5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 977.917685] env[67893]: DEBUG oslo_vmware.api [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Waiting for the task: (returnval){ [ 977.917685] env[67893]: value = "task-3455370" [ 977.917685] env[67893]: _type = "Task" [ 977.917685] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 977.925732] env[67893]: DEBUG oslo_vmware.api [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Task: {'id': task-3455370, 'name': DeleteDatastoreFile_Task} progress is 0%. 
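
Editor's note: every VMware call in this log follows the same wait pattern: invoke the *_Task method, then poll the task info on a fixed interval until it reports success (logging the duration) or error (translating the fault into an exception, as with the InvalidArgument fault earlier). A stdlib-only sketch of that loop with simplified task objects; the dict shape and TaskFault class are illustrative, not oslo.vmware's API:

    import time

    class TaskFault(Exception):
        pass

    def wait_for_task(poll, interval=0.5):
        # poll() returns {'state': ..., 'progress': ..., 'error': ...}.
        start = time.monotonic()
        while True:
            info = poll()
            if info["state"] == "success":
                print(f"completed successfully. duration_secs="
                      f"{time.monotonic() - start:.6f}")
                return info
            if info["state"] == "error":
                # oslo.vmware translates the fault name into an exception
                # class here; unmatched faults fall back to a generic one
                # ("Fault InvalidArgument not matched" above).
                raise TaskFault(info["error"])
            print(f"progress is {info['progress']}%.")
            time.sleep(interval)

    states = iter([
        {"state": "running", "progress": 0, "error": None},
        {"state": "success", "progress": 100, "error": None},
    ])
    wait_for_task(lambda: next(states), interval=0.01)
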
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 978.360160] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 978.360160] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Creating directory with path [datastore1] vmware_temp/79868114-00a0-44b2-b68b-809be3dd06e8/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 978.360437] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f9895c56-510e-4d67-aff8-134a01101cb6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 978.371836] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Created directory with path [datastore1] vmware_temp/79868114-00a0-44b2-b68b-809be3dd06e8/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 978.372054] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Fetch image to [datastore1] vmware_temp/79868114-00a0-44b2-b68b-809be3dd06e8/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 978.372324] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/79868114-00a0-44b2-b68b-809be3dd06e8/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 978.373006] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3051ff70-a09e-4d54-985c-96ff035bca9b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 978.379789] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47abd3e6-dbcc-40e2-a526-965b9a1aa8af {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 978.388661] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93e2c3c3-e2e7-44c9-bfd9-52bb8d51ef20 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 978.423512] env[67893]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21e8474e-1b91-482e-a8f8-20f5028149a3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 978.430567] env[67893]: DEBUG oslo_vmware.api [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Task: {'id': task-3455370, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.080886} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 978.432087] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 978.432285] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 978.432466] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 978.432625] env[67893]: INFO nova.compute.manager [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Took 0.60 seconds to destroy the instance on the hypervisor. 
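
Editor's note: the destroy path recorded above is a fixed sequence: unregister the VM (VirtualMachine.UnregisterVM), delete its datastore directory (FileManager.DeleteDatastoreFile_Task), then report the elapsed time ("Took 0.60 seconds ..."). A compressed sketch of that ordering and timing; both step functions are hypothetical placeholders for the real vmops calls:

    import time

    def unregister_vm(uuid):
        print(f"[instance: {uuid}] Unregistered the VM")

    def delete_datastore_dir(uuid):
        # Stands in for DeleteDatastoreFile_Task plus the task wait.
        print(f"Deleting the datastore file [datastore1] {uuid}")
        time.sleep(0.05)

    def destroy(uuid):
        start = time.monotonic()
        unregister_vm(uuid)
        delete_datastore_dir(uuid)
        print(f"Took {time.monotonic() - start:.2f} seconds to destroy "
              "the instance on the hypervisor.")

    destroy("96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5")
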
[ 978.434426] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c47717b4-8b2f-4afa-b41d-e6857e48609c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 978.436372] env[67893]: DEBUG nova.compute.claims [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 978.436547] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 978.436755] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 978.466293] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 978.520865] env[67893]: DEBUG oslo_vmware.rw_handles [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/79868114-00a0-44b2-b68b-809be3dd06e8/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 978.583952] env[67893]: DEBUG oslo_vmware.rw_handles [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 978.583952] env[67893]: DEBUG oslo_vmware.rw_handles [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/79868114-00a0-44b2-b68b-809be3dd06e8/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
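
Editor's note: rw_handles uploads the image by opening a raw HTTP connection to the datastore folder URL, streaming the image iterator into it, and then closing the write handle; the RemoteDisconnected seen back at 977.301871 is raised at exactly that close step and logged as a warning rather than failing the transfer. A stdlib sketch of such a write handle, explicitly not the real oslo_vmware.rw_handles implementation:

    import http.client

    class DatastoreWriteHandle:
        # Minimal sketch of a streaming write handle.

        def __init__(self, host, path, size):
            self._conn = http.client.HTTPSConnection(host, 443)
            self._conn.putrequest("PUT", path)
            # Announce the full size up front so the server can validate
            # the stream (cf. "size = 21318656" in the record above).
            self._conn.putheader("Content-Length", str(size))
            self._conn.endheaders()

        def write(self, chunk):
            self._conn.send(chunk)

        def close(self):
            try:
                # Some ESX endpoints drop the connection instead of
                # replying; that is the RemoteDisconnected warning.
                self._conn.getresponse()
            except http.client.RemoteDisconnected as exc:
                print(f"Error occurred while reading the HTTP response.: {exc}")
            finally:
                self._conn.close()
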
{{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 978.899504] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ee69254-dbdf-48b8-87a7-106bfc773d10 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 978.907415] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07afab30-3918-41ba-a374-f6c2c56abcd6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 978.937473] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2ee6890-ce78-4be8-abb3-e5b5969f4fe5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 978.944998] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f56f4f1c-3145-4d80-9cde-b229e771e8dd {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 978.958662] env[67893]: DEBUG nova.compute.provider_tree [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 978.967075] env[67893]: DEBUG nova.scheduler.client.report [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 978.981734] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.544s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 978.981917] env[67893]: ERROR nova.compute.manager [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 978.981917] env[67893]: Faults: ['InvalidArgument']
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Traceback (most recent call last):
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] self.driver.spawn(context, instance, image_meta,
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] self._fetch_image_if_missing(context, vi)
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] image_cache(vi, tmp_image_ds_loc)
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] vm_util.copy_virtual_disk(
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] session._wait_for_task(vmdk_copy_task)
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] return self.wait_for_task(task_ref)
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] return evt.wait()
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] result = hub.switch()
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] return self.greenlet.switch()
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] self.f(*self.args, **self.kw)
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] raise exceptions.translate_fault(task_info.error)
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Faults: ['InvalidArgument']
[ 978.981917] env[67893]: ERROR nova.compute.manager [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5]
[ 978.982988] env[67893]: DEBUG nova.compute.utils [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 978.984369] env[67893]: DEBUG nova.compute.manager [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Build of instance 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5 was re-scheduled: A specified parameter was not correct: fileType
[ 978.984369] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 978.987404] env[67893]: DEBUG nova.compute.manager [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 978.987404] env[67893]: DEBUG nova.compute.manager [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged.
{{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 978.987404] env[67893]: DEBUG nova.compute.manager [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 978.987404] env[67893]: DEBUG nova.network.neutron [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 979.412326] env[67893]: DEBUG nova.network.neutron [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 979.431991] env[67893]: INFO nova.compute.manager [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Took 0.44 seconds to deallocate network for instance. [ 979.538050] env[67893]: INFO nova.scheduler.client.report [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Deleted allocations for instance 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5 [ 979.561036] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bfd69b1f-6202-42d6-a3a4-bd33c208256d tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Lock "96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 335.435s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 979.565146] env[67893]: DEBUG oslo_concurrency.lockutils [None req-26e9203a-f9a7-4684-a039-c28d70896566 tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Lock "96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 136.147s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 979.565146] env[67893]: DEBUG oslo_concurrency.lockutils [None req-26e9203a-f9a7-4684-a039-c28d70896566 tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Acquiring lock "96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 979.565146] env[67893]: DEBUG oslo_concurrency.lockutils [None req-26e9203a-f9a7-4684-a039-c28d70896566 tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Lock "96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67893) 
inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 979.565146] env[67893]: DEBUG oslo_concurrency.lockutils [None req-26e9203a-f9a7-4684-a039-c28d70896566 tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Lock "96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 979.565146] env[67893]: INFO nova.compute.manager [None req-26e9203a-f9a7-4684-a039-c28d70896566 tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Terminating instance [ 979.566495] env[67893]: DEBUG nova.compute.manager [None req-26e9203a-f9a7-4684-a039-c28d70896566 tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 979.566638] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-26e9203a-f9a7-4684-a039-c28d70896566 tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 979.567150] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-06938f7c-3e43-45a8-abe3-ae0065debfdb {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 979.577075] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5d83954-dd69-4678-8b6e-8ff683a8fd1b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 979.588958] env[67893]: DEBUG nova.compute.manager [None req-9782ff27-31db-4875-8f2e-a64ff6162396 tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: 8831483f-3fbb-4463-9f8f-868d46bb3e4e] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 979.610154] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-26e9203a-f9a7-4684-a039-c28d70896566 tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5 could not be found. [ 979.610154] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-26e9203a-f9a7-4684-a039-c28d70896566 tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 979.610420] env[67893]: INFO nova.compute.manager [None req-26e9203a-f9a7-4684-a039-c28d70896566 tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Took 0.04 seconds to destroy the instance on the hypervisor. 
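
The build failure above surfaces as oslo_vmware.exceptions.VimFaultException, whose fault_list carries the vSphere fault names ('InvalidArgument' here); Nova's reaction, visible in the log, is to abort the resource claim, delete the placement allocations, and re-schedule the build. Below is a hedged sketch of inspecting such a fault outside Nova, reusing the placeholder session/task names assumed in the earlier sketch.

    import logging

    from oslo_vmware import exceptions as vexc

    LOG = logging.getLogger(__name__)

    try:
        session.wait_for_task(task)
    except vexc.VimFaultException as e:
        # fault_list names the vSphere faults, e.g. ['InvalidArgument']
        # for "A specified parameter was not correct: fileType" above.
        if 'InvalidArgument' in e.fault_list:
            LOG.error('vCenter rejected a task parameter: %s', e)
        raise  # let the caller decide whether to retry or re-schedule
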
[ 979.610550] env[67893]: DEBUG oslo.service.loopingcall [None req-26e9203a-f9a7-4684-a039-c28d70896566 tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 979.610803] env[67893]: DEBUG nova.compute.manager [-] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 979.610803] env[67893]: DEBUG nova.network.neutron [-] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 979.621922] env[67893]: DEBUG nova.compute.manager [None req-9782ff27-31db-4875-8f2e-a64ff6162396 tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] [instance: 8831483f-3fbb-4463-9f8f-868d46bb3e4e] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 979.659316] env[67893]: DEBUG nova.network.neutron [-] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 979.684843] env[67893]: INFO nova.compute.manager [-] [instance: 96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5] Took 0.07 seconds to deallocate network for instance. [ 979.690747] env[67893]: DEBUG oslo_concurrency.lockutils [None req-9782ff27-31db-4875-8f2e-a64ff6162396 tempest-DeleteServersAdminTestJSON-2022472478 tempest-DeleteServersAdminTestJSON-2022472478-project-member] Lock "8831483f-3fbb-4463-9f8f-868d46bb3e4e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 198.270s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 979.703955] env[67893]: DEBUG nova.compute.manager [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Starting instance... 
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 979.792918] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 979.794031] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 979.794753] env[67893]: INFO nova.compute.claims [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 979.823793] env[67893]: DEBUG oslo_concurrency.lockutils [None req-26e9203a-f9a7-4684-a039-c28d70896566 tempest-ServerMetadataTestJSON-961066541 tempest-ServerMetadataTestJSON-961066541-project-member] Lock "96f86f1d-a1cb-4ec7-b72e-9cf41ff6e4e5" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.261s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 980.165853] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9160551d-210b-458e-bb8e-1c436bc0f96e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 980.176802] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1dbabaf-4264-4d13-922d-0478cea3b062 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 980.207463] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9afac2e7-7bc4-4def-8b86-3934b3e5d44d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 980.214851] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf1eaf82-4148-4286-aa83-f9b7926894eb {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 980.229556] env[67893]: DEBUG nova.compute.provider_tree [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 980.238799] env[67893]: DEBUG nova.scheduler.client.report [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 
'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 980.255042] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.461s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 980.255042] env[67893]: DEBUG nova.compute.manager [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 980.295204] env[67893]: DEBUG nova.compute.utils [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 980.296423] env[67893]: DEBUG nova.compute.manager [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 980.296584] env[67893]: DEBUG nova.network.neutron [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 980.305824] env[67893]: DEBUG nova.compute.manager [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Start building block device mappings for instance. 
{{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 980.357016] env[67893]: DEBUG nova.policy [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2a7cf83daef347089e10728559ab9d26', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '093685d267204cd99da54a398df3682b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 980.380099] env[67893]: DEBUG nova.compute.manager [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Start spawning the instance on the hypervisor. {{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 980.406528] env[67893]: DEBUG nova.virt.hardware [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 980.406782] env[67893]: DEBUG nova.virt.hardware [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 980.406940] env[67893]: DEBUG nova.virt.hardware [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 980.407475] env[67893]: DEBUG nova.virt.hardware [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 980.407475] env[67893]: DEBUG nova.virt.hardware [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 980.407475] env[67893]: DEBUG nova.virt.hardware [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 980.407638] env[67893]: DEBUG nova.virt.hardware [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 980.407763] env[67893]: DEBUG nova.virt.hardware [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 980.407930] env[67893]: DEBUG nova.virt.hardware [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 980.408107] env[67893]: DEBUG nova.virt.hardware [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 980.408284] env[67893]: DEBUG nova.virt.hardware [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 980.409144] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d99bbbd4-55ac-43aa-a3f6-066ff11802dc {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 980.416981] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5bb1c7c0-4222-47d1-add7-f64d901a652d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 980.791732] env[67893]: DEBUG nova.network.neutron [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Successfully created port: b3de47bc-62b1-44b2-b6a9-c23d867c83fd {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 981.702629] env[67893]: DEBUG nova.compute.manager [req-ca59a07b-ab79-4a0a-a621-34aba619fc5d req-dbd137b6-5eb0-493f-9843-80035afcffe7 service nova] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Received event network-vif-plugged-b3de47bc-62b1-44b2-b6a9-c23d867c83fd {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 981.702629] 
env[67893]: DEBUG oslo_concurrency.lockutils [req-ca59a07b-ab79-4a0a-a621-34aba619fc5d req-dbd137b6-5eb0-493f-9843-80035afcffe7 service nova] Acquiring lock "2553f3c0-0988-4e11-a138-7e5f71e71f48-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 981.702629] env[67893]: DEBUG oslo_concurrency.lockutils [req-ca59a07b-ab79-4a0a-a621-34aba619fc5d req-dbd137b6-5eb0-493f-9843-80035afcffe7 service nova] Lock "2553f3c0-0988-4e11-a138-7e5f71e71f48-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 981.705141] env[67893]: DEBUG oslo_concurrency.lockutils [req-ca59a07b-ab79-4a0a-a621-34aba619fc5d req-dbd137b6-5eb0-493f-9843-80035afcffe7 service nova] Lock "2553f3c0-0988-4e11-a138-7e5f71e71f48-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 981.705141] env[67893]: DEBUG nova.compute.manager [req-ca59a07b-ab79-4a0a-a621-34aba619fc5d req-dbd137b6-5eb0-493f-9843-80035afcffe7 service nova] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] No waiting events found dispatching network-vif-plugged-b3de47bc-62b1-44b2-b6a9-c23d867c83fd {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 981.705141] env[67893]: WARNING nova.compute.manager [req-ca59a07b-ab79-4a0a-a621-34aba619fc5d req-dbd137b6-5eb0-493f-9843-80035afcffe7 service nova] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Received unexpected event network-vif-plugged-b3de47bc-62b1-44b2-b6a9-c23d867c83fd for instance with vm_state building and task_state spawning. 
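
The acquire/release triplets above ("Acquiring lock ... by ...", "acquired ... waited", "released ... held") come from oslo.concurrency's lockutils wrapper around per-instance "<uuid>-events" locks. A minimal sketch of that pattern follows, with a stand-in function body rather than Nova's InstanceEvents logic; the lock name is modeled on the log.

    from oslo_concurrency import lockutils

    INSTANCE_UUID = '2553f3c0-0988-4e11-a138-7e5f71e71f48'  # from the log

    @lockutils.synchronized('%s-events' % INSTANCE_UUID)
    def pop_instance_event():
        # Runs with the named in-process lock held; the lockutils "inner"
        # wrapper emits the DEBUG acquire/wait/hold timings seen above.
        return None

    pop_instance_event()
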
[ 981.708856] env[67893]: DEBUG nova.network.neutron [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Successfully updated port: b3de47bc-62b1-44b2-b6a9-c23d867c83fd {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 981.723792] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Acquiring lock "refresh_cache-2553f3c0-0988-4e11-a138-7e5f71e71f48" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 981.723792] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Acquired lock "refresh_cache-2553f3c0-0988-4e11-a138-7e5f71e71f48" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 981.723792] env[67893]: DEBUG nova.network.neutron [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 981.777015] env[67893]: DEBUG nova.network.neutron [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Instance cache missing network info. 
{{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 981.981394] env[67893]: DEBUG nova.network.neutron [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Updating instance_info_cache with network_info: [{"id": "b3de47bc-62b1-44b2-b6a9-c23d867c83fd", "address": "fa:16:3e:87:38:d7", "network": {"id": "f56dd9da-3db9-478d-84c5-e0354db59d15", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-732765990-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "093685d267204cd99da54a398df3682b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb3de47bc-62", "ovs_interfaceid": "b3de47bc-62b1-44b2-b6a9-c23d867c83fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 981.997495] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Releasing lock "refresh_cache-2553f3c0-0988-4e11-a138-7e5f71e71f48" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 981.997802] env[67893]: DEBUG nova.compute.manager [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Instance network_info: |[{"id": "b3de47bc-62b1-44b2-b6a9-c23d867c83fd", "address": "fa:16:3e:87:38:d7", "network": {"id": "f56dd9da-3db9-478d-84c5-e0354db59d15", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-732765990-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "093685d267204cd99da54a398df3682b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb3de47bc-62", "ovs_interfaceid": "b3de47bc-62b1-44b2-b6a9-c23d867c83fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 981.998214] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:87:38:d7', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '27138a4c-60c9-45fb-bf37-4c2f765315a3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b3de47bc-62b1-44b2-b6a9-c23d867c83fd', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 982.009691] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Creating folder: Project (093685d267204cd99da54a398df3682b). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 982.010352] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-40695d9c-24aa-4c31-90a5-0b1999618479 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 982.024037] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Created folder: Project (093685d267204cd99da54a398df3682b) in parent group-v689771. [ 982.024238] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Creating folder: Instances. Parent ref: group-v689825. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 982.024464] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2f15cb80-cbaf-47eb-b046-974ed634b2a9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 982.035105] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Created folder: Instances in parent group-v689825. [ 982.035593] env[67893]: DEBUG oslo.service.loopingcall [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 982.035593] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 982.035762] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-271c54cf-5f0a-42fd-845f-9ff725c047d2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 982.056601] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 982.056601] env[67893]: value = "task-3455373" [ 982.056601] env[67893]: _type = "Task" [ 982.056601] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 982.064406] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455373, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 982.567479] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455373, 'name': CreateVM_Task, 'duration_secs': 0.313609} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 982.567708] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 982.568332] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 982.568494] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 982.568840] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 982.569146] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-668c00e5-6618-4eb7-8c8f-d461f87814e8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 982.575286] env[67893]: DEBUG oslo_vmware.api [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Waiting for the task: (returnval){ [ 982.575286] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]522b0bb0-2f12-11a7-f645-217dec6ecacb" [ 982.575286] env[67893]: _type = "Task" [ 982.575286] env[67893]: } to complete. 
{{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 982.583069] env[67893]: DEBUG oslo_vmware.api [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]522b0bb0-2f12-11a7-f645-217dec6ecacb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 983.085563] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 983.085833] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 983.087462] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 983.759242] env[67893]: DEBUG nova.compute.manager [req-4872334e-f3d9-424e-9dd8-ef5509a647a8 req-7fdc73a1-f588-43e9-a61f-d62cc7163341 service nova] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Received event network-changed-b3de47bc-62b1-44b2-b6a9-c23d867c83fd {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 983.759242] env[67893]: DEBUG nova.compute.manager [req-4872334e-f3d9-424e-9dd8-ef5509a647a8 req-7fdc73a1-f588-43e9-a61f-d62cc7163341 service nova] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Refreshing instance network info cache due to event network-changed-b3de47bc-62b1-44b2-b6a9-c23d867c83fd. 
{{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 983.759242] env[67893]: DEBUG oslo_concurrency.lockutils [req-4872334e-f3d9-424e-9dd8-ef5509a647a8 req-7fdc73a1-f588-43e9-a61f-d62cc7163341 service nova] Acquiring lock "refresh_cache-2553f3c0-0988-4e11-a138-7e5f71e71f48" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 983.759242] env[67893]: DEBUG oslo_concurrency.lockutils [req-4872334e-f3d9-424e-9dd8-ef5509a647a8 req-7fdc73a1-f588-43e9-a61f-d62cc7163341 service nova] Acquired lock "refresh_cache-2553f3c0-0988-4e11-a138-7e5f71e71f48" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 983.759242] env[67893]: DEBUG nova.network.neutron [req-4872334e-f3d9-424e-9dd8-ef5509a647a8 req-7fdc73a1-f588-43e9-a61f-d62cc7163341 service nova] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Refreshing network info cache for port b3de47bc-62b1-44b2-b6a9-c23d867c83fd {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 984.160627] env[67893]: DEBUG nova.network.neutron [req-4872334e-f3d9-424e-9dd8-ef5509a647a8 req-7fdc73a1-f588-43e9-a61f-d62cc7163341 service nova] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Updated VIF entry in instance network info cache for port b3de47bc-62b1-44b2-b6a9-c23d867c83fd. {{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 984.161035] env[67893]: DEBUG nova.network.neutron [req-4872334e-f3d9-424e-9dd8-ef5509a647a8 req-7fdc73a1-f588-43e9-a61f-d62cc7163341 service nova] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Updating instance_info_cache with network_info: [{"id": "b3de47bc-62b1-44b2-b6a9-c23d867c83fd", "address": "fa:16:3e:87:38:d7", "network": {"id": "f56dd9da-3db9-478d-84c5-e0354db59d15", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-732765990-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "093685d267204cd99da54a398df3682b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb3de47bc-62", "ovs_interfaceid": "b3de47bc-62b1-44b2-b6a9-c23d867c83fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 984.173702] env[67893]: DEBUG oslo_concurrency.lockutils [req-4872334e-f3d9-424e-9dd8-ef5509a647a8 req-7fdc73a1-f588-43e9-a61f-d62cc7163341 service nova] Releasing lock "refresh_cache-2553f3c0-0988-4e11-a138-7e5f71e71f48" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 988.559508] env[67893]: DEBUG oslo_concurrency.lockutils [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] 
Acquiring lock "5ede1991-efee-4c34-af5b-ce71f67456ef" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 988.559880] env[67893]: DEBUG oslo_concurrency.lockutils [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "5ede1991-efee-4c34-af5b-ce71f67456ef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 998.859614] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1000.858629] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1001.854061] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1001.858699] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1001.859024] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1001.859024] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1001.880676] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1001.880822] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1001.880949] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Skipping network cache update for instance because it is Building. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1001.881208] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1001.881362] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1001.881496] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1001.881632] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1001.881752] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1001.881867] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1001.881982] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1001.882120] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1001.882576] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1001.882716] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... 
[ 1002.859547] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1002.859852] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1003.858738] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1004.859728] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1004.870999] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1004.871255] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1004.871428] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1004.871606] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1004.872736] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d1f492d-fd38-4d65-9e83-43bbe0df1afa {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1004.881902] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1b1f381-30fc-493d-ad1a-327864af7690 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1004.895904] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c89924f-0eab-47bc-ab1f-2a777eec6c9c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1004.902221] env[67893]: DEBUG oslo_vmware.service [-] Invoking
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62ad2773-0d44-404e-99fe-caedf47682e0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1004.931778] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180966MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1004.931778] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1004.931778] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1005.002761] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2f69fae8-d060-4156-8880-071f5ee1f969 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1005.002761] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2256af1c-4ff8-46b9-b568-c25ce8886e5f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1005.002761] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 6520080a-8bf1-4803-9099-87c3ba6e28e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1005.002912] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 19ab9782-9131-46ba-bbf2-cc021953046e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1005.003018] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2eb8d698-9436-4e91-bd10-5f5200415144 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1005.003151] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1005.003270] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1005.003388] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a9656a7e-8a7b-489e-9990-097c1e93e535 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1005.003505] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance fcae7119-6233-4a52-9e52-1147f2b10ddc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1005.003619] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2553f3c0-0988-4e11-a138-7e5f71e71f48 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1005.015336] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c05df6c1-e4c9-4276-9981-e80e584d540c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1005.026292] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 89e2963e-83e2-4e29-843d-7c15abdf78bc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1005.038422] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 88a48088-829d-40c1-85e1-6e78b8f5cea9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1005.049654] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5a24adaf-bced-4488-9ccb-fc996b2ba154 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1005.060151] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance d8f75420-059d-4af1-8545-b5c4f67f4fe3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1005.070087] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a6771d3c-90ff-4403-9124-e74d74256db8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1005.080915] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 993c926b-bdc5-4f7e-992a-aac8c658ea6c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1005.091402] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 429fffb6-8355-419d-8cbb-a406d723802b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1005.100947] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 8c9b4750-db4e-446b-b108-fc675c6f4c69 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1005.110296] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance aa02f91b-b125-42af-be9a-c565ed041288 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1005.119810] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance e720d161-0c76-47ab-8d24-e465109d6e8c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1005.128749] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5e7e68fb-f8a5-46c9-b0b1-9fbc96c82428 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1005.138295] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance f27780c8-3155-480a-bb3c-e93cdac254f2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1005.147068] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 15515d0f-d317-4cc3-a922-c8a64654f4b2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1005.155955] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c0e59ef6-c233-490f-ab69-ab198142590a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1005.164626] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance efdb0a7e-403d-4de5-8c09-72b9c8f9cd79 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1005.173380] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5ede1991-efee-4c34-af5b-ce71f67456ef has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1005.173603] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1005.173749] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1005.446641] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-adc3a677-98ac-4434-8548-70b70532da60 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.454092] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-729c1d56-48be-4049-b238-a7bb804d3885 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.484488] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-170f2de3-28ed-45f0-987e-6df1962e9e7b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.491692] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23602b4c-dcc2-4d18-a1db-d3fca41580d7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.504828] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1005.517021] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1005.530090] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1005.530314] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.599s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
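The final resource view above is plain arithmetic over the audit that precedes it: the ten actively managed instances each hold {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}, and adding the 512 MB reserved in the MEMORY_MB inventory yields exactly used_ram=1792MB, used_disk=10GB and used_vcpus=10; the instances that are only scheduled ("Skipping heal of allocation") do not count yet. A quick check of that sum, with values copied from the log (the variable names are illustrative, not the ResourceTracker API):

    # Values copied from the audit above; names are illustrative.
    RESERVED_MEMORY_MB = 512  # 'reserved' in the MEMORY_MB inventory

    # One allocation per actively managed instance in the listing.
    allocations = [{"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1}] * 10

    used_vcpus = sum(a["VCPU"] for a in allocations)
    used_ram_mb = RESERVED_MEMORY_MB + sum(a["MEMORY_MB"] for a in allocations)
    used_disk_gb = sum(a["DISK_GB"] for a in allocations)

    print(used_vcpus, used_ram_mb, used_disk_gb)  # -> 10 1792 10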
[ 1025.700757] env[67893]: WARNING oslo_vmware.rw_handles [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1025.700757] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1025.700757] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1025.700757] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1025.700757] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1025.700757] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 1025.700757] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1025.700757] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1025.700757] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1025.700757] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1025.700757] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1025.700757] env[67893]: ERROR oslo_vmware.rw_handles [ 1025.701410] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/79868114-00a0-44b2-b68b-809be3dd06e8/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1025.703010] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1025.703258] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Copying Virtual Disk [datastore1] vmware_temp/79868114-00a0-44b2-b68b-809be3dd06e8/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/79868114-00a0-44b2-b68b-809be3dd06e8/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1025.703542] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ed17d308-ae6d-44cf-a897-6f2875b57971 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1025.712656] env[67893]: DEBUG oslo_vmware.api [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Waiting for the task: (returnval){ [ 1025.712656] env[67893]: value = "task-3455374" [ 
1025.712656] env[67893]: _type = "Task" [ 1025.712656] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1025.720630] env[67893]: DEBUG oslo_vmware.api [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Task: {'id': task-3455374, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1026.224685] env[67893]: DEBUG oslo_vmware.exceptions [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1026.224977] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1026.225540] env[67893]: ERROR nova.compute.manager [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1026.225540] env[67893]: Faults: ['InvalidArgument'] [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Traceback (most recent call last): [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] yield resources [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] self.driver.spawn(context, instance, image_meta, [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] self._fetch_image_if_missing(context, vi) [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 
2f69fae8-d060-4156-8880-071f5ee1f969] image_cache(vi, tmp_image_ds_loc) [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] vm_util.copy_virtual_disk( [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] session._wait_for_task(vmdk_copy_task) [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] return self.wait_for_task(task_ref) [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] return evt.wait() [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] result = hub.switch() [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] return self.greenlet.switch() [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] self.f(*self.args, **self.kw) [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] raise exceptions.translate_fault(task_info.error) [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Faults: ['InvalidArgument'] [ 1026.225540] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] [ 1026.226433] env[67893]: INFO nova.compute.manager [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Terminating instance [ 1026.227932] env[67893]: DEBUG oslo_concurrency.lockutils [None 
req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1026.227932] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1026.227932] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-217ead8b-964c-4b82-83ad-1177970efd73 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1026.230091] env[67893]: DEBUG nova.compute.manager [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1026.230286] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1026.231025] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2a6c775-b9a1-446d-addc-20db133f4780 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1026.239387] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1026.239686] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c826a5e4-adf5-4080-b6ec-56418a842687 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1026.242315] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1026.242490] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1026.243437] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0fd28250-a9b4-413e-a54d-07c419b5b0fe {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1026.248540] env[67893]: DEBUG oslo_vmware.api [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Waiting for the task: (returnval){ [ 1026.248540] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52652409-af16-1993-5bd2-598a26b83e3b" [ 1026.248540] env[67893]: _type = "Task" [ 1026.248540] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1026.256050] env[67893]: DEBUG oslo_vmware.api [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52652409-af16-1993-5bd2-598a26b83e3b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1026.307998] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1026.308258] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1026.308438] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Deleting the datastore file [datastore1] 2f69fae8-d060-4156-8880-071f5ee1f969 {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1026.308693] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-80b31851-f591-4202-a3a2-e2c6e8bd750b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1026.315231] env[67893]: DEBUG oslo_vmware.api [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Waiting for the task: (returnval){ [ 1026.315231] env[67893]: value = "task-3455376" [ 1026.315231] env[67893]: _type = "Task" [ 1026.315231] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1026.322949] env[67893]: DEBUG oslo_vmware.api [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Task: {'id': task-3455376, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
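Both "Waiting for the task" blocks above come from oslo.vmware's wait_for_task, which polls the vCenter task object on an interval, logs "progress is N%." on each pass, and translates a terminal error into a fault exception (that is where the InvalidArgument fault seen earlier surfaced). A stdlib-only sketch of that polling loop, with an illustrative poll callable standing in for the real vSphere API:

    import time

    def wait_for_task(poll, interval=0.5):
        """Poll a task until it ends, like oslo_vmware.api's wait/poll pair."""
        while True:
            state, progress, error = poll()
            if state == "success":
                return
            if state == "error":
                # oslo.vmware translates the task error into a fault exception
                raise RuntimeError(error)
            print(f"progress is {progress}%.")
            time.sleep(interval)

    # A fake task: running on the first poll, finished on the second.
    states = iter([("running", 0, None), ("success", 100, None)])
    wait_for_task(lambda: next(states), interval=0)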
[ 1026.759488] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1026.759896] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Creating directory with path [datastore1] vmware_temp/380aa38c-3f8f-4adc-92f7-04493259ba93/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1026.760013] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9211de99-bc1e-4c89-a540-db3a08083ef3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1026.770847] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Created directory with path [datastore1] vmware_temp/380aa38c-3f8f-4adc-92f7-04493259ba93/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1026.771056] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Fetch image to [datastore1] vmware_temp/380aa38c-3f8f-4adc-92f7-04493259ba93/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1026.771258] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/380aa38c-3f8f-4adc-92f7-04493259ba93/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1026.771920] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8c2f46a-f36f-4ee6-9460-569d93b5a0c0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1026.778804] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4aec1c97-08a5-46dd-ab09-fb26d04bded2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1026.787616] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77312947-5e4a-4036-988e-071e57053daf {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1026.821057] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f7c7537-d6d3-4e38-abcc-c636112d4121 {{(pid=67893)
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1026.829507] env[67893]: DEBUG oslo_vmware.api [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Task: {'id': task-3455376, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073953} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1026.830637] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1026.830637] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1026.830637] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1026.830637] env[67893]: INFO nova.compute.manager [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Took 0.60 seconds to destroy the instance on the hypervisor. 
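Two requests interleave through this stretch: req-c0654902 is tearing down instance 2f69fae8 after its failed spawn, while req-515cf0ea re-fetches the image for 2256af1c. When reading interleaved compute logs it helps to split them back out by request id; a small illustrative filter (the record-boundary regex assumes this log's "[ seconds]" prefix):

    import re

    def records_for(req_id, log_text):
        """Return the log records that mention a given request id."""
        # A record starts at a "[ <seconds>]" timestamp and runs to the next one.
        records = re.split(r"(?=\[ \d+\.\d+\])", log_text)
        return [r for r in records if req_id in r]

    sample = ("[ 1026.830637] env[67893]: INFO nova.compute.manager [None "
              "req-c0654902-721a-4864-9744-0ec6060e0acd ...] Took 0.60 seconds "
              "to destroy the instance on the hypervisor. [ 1026.832025] "
              "env[67893]: DEBUG oslo_vmware.service [-] Invoking ...")
    print(records_for("req-c0654902", sample))  # -> only the destroy record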
[ 1026.832025] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1982be13-0653-4815-a52b-d98c853ce667 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1026.833823] env[67893]: DEBUG nova.compute.claims [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1026.833996] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1026.834230] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1026.859421] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1026.917150] env[67893]: DEBUG oslo_vmware.rw_handles [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/380aa38c-3f8f-4adc-92f7-04493259ba93/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1026.977285] env[67893]: DEBUG oslo_vmware.rw_handles [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1026.977482] env[67893]: DEBUG oslo_vmware.rw_handles [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/380aa38c-3f8f-4adc-92f7-04493259ba93/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
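The rw_handles lines above stream 21318656 bytes of image data through an HTTPS write handle to the datastore folder URL and then close the handle. The essential pattern is a chunked copy from an image iterator into a sink, with the byte count checked against the declared size; a stdlib sketch under those assumptions (the chunking and the sink are illustrative):

    import io

    def transfer(image_iter, sink, expected_size):
        """Stream chunks into sink and verify the byte count at the end."""
        written = 0
        for chunk in image_iter:
            sink.write(chunk)
            written += len(chunk)
        if written != expected_size:
            # A short stream surfaces as an error when the handle is closed,
            # much like the RemoteDisconnected seen earlier in this log.
            raise IOError(f"short write: {written} of {expected_size} bytes")
        return written

    buf = io.BytesIO()
    chunks = [b"x" * 1024] * 4                # stand-in for the image iterator
    print(transfer(iter(chunks), buf, 4096))  # -> 4096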
[ 1027.234034] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-277c9f11-6722-4552-ab78-3b80c511f9f4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1027.241746] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf0a4fa2-5dff-4316-98dd-72f232fb98ab {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1027.272199] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff09668d-29d7-426f-9cc6-6b18bb0ddd6c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1027.279340] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c761759-7ba4-42fc-89ac-c5b71859f8f0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1027.292353] env[67893]: DEBUG nova.compute.provider_tree [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1027.301635] env[67893]: DEBUG nova.scheduler.client.report [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1027.318088] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.484s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1027.318634] env[67893]: ERROR nova.compute.manager [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1027.318634] env[67893]: Faults: ['InvalidArgument'] [ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Traceback (most recent call last): [ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] File "/opt/stack/nova/nova/compute/manager.py", line
2615, in _build_and_run_instance [ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] self.driver.spawn(context, instance, image_meta, [ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] self._fetch_image_if_missing(context, vi) [ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] image_cache(vi, tmp_image_ds_loc) [ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] vm_util.copy_virtual_disk( [ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] session._wait_for_task(vmdk_copy_task) [ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] return self.wait_for_task(task_ref) [ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] return evt.wait() [ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] result = hub.switch() [ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] return self.greenlet.switch() [ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] self.f(*self.args, **self.kw) [ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 
2f69fae8-d060-4156-8880-071f5ee1f969] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] raise exceptions.translate_fault(task_info.error)
[ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Faults: ['InvalidArgument']
[ 1027.318634] env[67893]: ERROR nova.compute.manager [instance: 2f69fae8-d060-4156-8880-071f5ee1f969]
[ 1027.319412] env[67893]: DEBUG nova.compute.utils [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1027.320800] env[67893]: DEBUG nova.compute.manager [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Build of instance 2f69fae8-d060-4156-8880-071f5ee1f969 was re-scheduled: A specified parameter was not correct: fileType
[ 1027.320800] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1027.321193] env[67893]: DEBUG nova.compute.manager [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1027.321367] env[67893]: DEBUG nova.compute.manager [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1027.321538] env[67893]: DEBUG nova.compute.manager [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1027.321706] env[67893]: DEBUG nova.network.neutron [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1027.673716] env[67893]: DEBUG nova.network.neutron [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1027.687991] env[67893]: INFO nova.compute.manager [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Took 0.37 seconds to deallocate network for instance.
[ 1027.805017] env[67893]: INFO nova.scheduler.client.report [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Deleted allocations for instance 2f69fae8-d060-4156-8880-071f5ee1f969
[ 1027.826030] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c0654902-721a-4864-9744-0ec6060e0acd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Lock "2f69fae8-d060-4156-8880-071f5ee1f969" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 381.847s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1027.826030] env[67893]: DEBUG oslo_concurrency.lockutils [None req-99c49e91-ec40-4634-8ecd-d9fc6369c4cd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Lock "2f69fae8-d060-4156-8880-071f5ee1f969" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 182.722s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1027.826030] env[67893]: DEBUG oslo_concurrency.lockutils [None req-99c49e91-ec40-4634-8ecd-d9fc6369c4cd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Acquiring lock "2f69fae8-d060-4156-8880-071f5ee1f969-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1027.826030] env[67893]: DEBUG oslo_concurrency.lockutils [None req-99c49e91-ec40-4634-8ecd-d9fc6369c4cd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Lock "2f69fae8-d060-4156-8880-071f5ee1f969-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1027.826030] env[67893]: DEBUG oslo_concurrency.lockutils [None req-99c49e91-ec40-4634-8ecd-d9fc6369c4cd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Lock "2f69fae8-d060-4156-8880-071f5ee1f969-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1027.827904] env[67893]: INFO nova.compute.manager [None req-99c49e91-ec40-4634-8ecd-d9fc6369c4cd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Terminating instance
[ 1027.834570] env[67893]: DEBUG nova.compute.manager [None req-99c49e91-ec40-4634-8ecd-d9fc6369c4cd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1027.834570] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-99c49e91-ec40-4634-8ecd-d9fc6369c4cd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1027.834570] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-195b3e2c-4aee-46d8-b08a-f8148fa059b7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1027.841454] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-876c4053-0abc-4a9c-b33e-60234b550b24 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1027.852578] env[67893]: DEBUG nova.compute.manager [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1027.873625] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-99c49e91-ec40-4634-8ecd-d9fc6369c4cd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 2f69fae8-d060-4156-8880-071f5ee1f969 could not be found.
[ 1027.873858] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-99c49e91-ec40-4634-8ecd-d9fc6369c4cd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1027.874076] env[67893]: INFO nova.compute.manager [None req-99c49e91-ec40-4634-8ecd-d9fc6369c4cd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1027.874288] env[67893]: DEBUG oslo.service.loopingcall [None req-99c49e91-ec40-4634-8ecd-d9fc6369c4cd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1027.874504] env[67893]: DEBUG nova.compute.manager [-] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1027.874617] env[67893]: DEBUG nova.network.neutron [-] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1027.915988] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1027.916182] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1027.917715] env[67893]: INFO nova.compute.claims [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1027.928838] env[67893]: DEBUG nova.network.neutron [-] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1027.936234] env[67893]: INFO nova.compute.manager [-] [instance: 2f69fae8-d060-4156-8880-071f5ee1f969] Took 0.06 seconds to deallocate network for instance.
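The "Waiting for function ... to return" line above is emitted by oslo.service's looping-call machinery while network deallocation is retried. A minimal sketch of that primitive, using FixedIntervalLoopingCall (a real oslo.service class, though not necessarily the exact variant Nova uses here); the attempt function is illustrative:

    from oslo_service import loopingcall

    def _one_attempt():
        # ... try the deallocation once; signal completion to stop looping ...
        raise loopingcall.LoopingCallDone(retvalue=True)

    timer = loopingcall.FixedIntervalLoopingCall(_one_attempt)
    result = timer.start(interval=1).wait()  # blocks until LoopingCallDone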
[ 1028.024938] env[67893]: DEBUG oslo_concurrency.lockutils [None req-99c49e91-ec40-4634-8ecd-d9fc6369c4cd tempest-ImagesOneServerNegativeTestJSON-780524237 tempest-ImagesOneServerNegativeTestJSON-780524237-project-member] Lock "2f69fae8-d060-4156-8880-071f5ee1f969" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.200s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1028.272678] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39a0c441-b9a7-409a-be8d-34f8bff09a1f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1028.280149] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae673275-8bf6-439f-9ee8-1ba688ada608 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1028.308651] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bb04cca-5cfe-4a51-9156-54d6c4a5a7ef {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1028.315457] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d8a0b31-0a02-4fc9-91fb-a1dd0bebc028 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1028.328325] env[67893]: DEBUG nova.compute.provider_tree [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1028.341379] env[67893]: DEBUG nova.scheduler.client.report [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1028.355464] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.439s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1028.355935] env[67893]: DEBUG nova.compute.manager [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
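The inventory payload logged above follows the Placement inventory schema. Reproduced here as an annotated Python dict (values copied from the log line; comments are editorial):

    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
                 'step_size': 1, 'allocation_ratio': 4.0},   # 48 pCPUs, 4x overcommit
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1,
                      'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95,
                    'step_size': 1, 'allocation_ratio': 1.0},  # no disk overcommit
    }
    # Schedulable capacity per class is (total - reserved) * allocation_ratio:
    # VCPU: (48 - 0) * 4.0 = 192; MEMORY_MB: (196590 - 512) * 1.0 = 196078.
    # max_unit caps a single allocation, e.g. no one flavor may claim > 16 vCPUs.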
[ 1028.387918] env[67893]: DEBUG nova.compute.utils [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1028.388608] env[67893]: DEBUG nova.compute.manager [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1028.388782] env[67893]: DEBUG nova.network.neutron [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1028.397895] env[67893]: DEBUG nova.compute.manager [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1028.459070] env[67893]: DEBUG nova.policy [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c41cfe8c6978476d9a00f1240414e9e0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '368f4fa65bee4d6ba722728d1e453238', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1028.467807] env[67893]: DEBUG nova.compute.manager [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Start spawning the instance on the hypervisor. {{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 1028.499459] env[67893]: DEBUG nova.virt.hardware [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=<?>,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-15T20:39:38Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1028.499719] env[67893]: DEBUG nova.virt.hardware [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1028.499903] env[67893]: DEBUG nova.virt.hardware [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1028.500068] env[67893]: DEBUG nova.virt.hardware [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1028.500215] env[67893]: DEBUG nova.virt.hardware [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1028.500396] env[67893]: DEBUG nova.virt.hardware [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1028.500584] env[67893]: DEBUG nova.virt.hardware [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1028.500773] env[67893]: DEBUG nova.virt.hardware [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1028.500952] env[67893]: DEBUG nova.virt.hardware [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1028.501130] env[67893]: DEBUG nova.virt.hardware [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1028.501301] env[67893]: DEBUG nova.virt.hardware [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
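The topology walk above (flavor/image limits of 0:0:0 defaulting to 65536, a single candidate for 1 vCPU) can be reproduced with a simplified stand-in for nova.virt.hardware's search; this is an illustration, not Nova's actual code:

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
        """Yield (sockets, cores, threads) triples whose product equals vcpus."""
        for s in range(1, min(vcpus, max_sockets) + 1):
            for c in range(1, min(vcpus, max_cores) + 1):
                for t in range(1, min(vcpus, max_threads) + 1):
                    if s * c * t == vcpus:
                        yield (s, c, t)

    print(list(possible_topologies(1)))  # [(1, 1, 1)] -- the single topology logged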
[ 1028.503504] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad0b7135-1ff6-48a8-8a61-e47305d3af4a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1028.511978] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2ad64f9-aa94-491a-bd21-99d70ba9110c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1028.898532] env[67893]: DEBUG nova.network.neutron [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Successfully created port: 561ec9a4-a764-4f1c-bcd0-e472b106499e {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1029.684695] env[67893]: DEBUG nova.network.neutron [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Successfully updated port: 561ec9a4-a764-4f1c-bcd0-e472b106499e {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1029.703738] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Acquiring lock "refresh_cache-c05df6c1-e4c9-4276-9981-e80e584d540c" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1029.703918] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Acquired lock "refresh_cache-c05df6c1-e4c9-4276-9981-e80e584d540c" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1029.704086] env[67893]: DEBUG nova.network.neutron [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1029.765567] env[67893]: DEBUG nova.network.neutron [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1029.960872] env[67893]: DEBUG nova.compute.manager [req-048810b2-6355-4c0a-bbe0-6b149d1f42c7 req-41ebe6f8-08f2-4986-9a23-b1a87d6e87ff service nova] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Received event network-vif-plugged-561ec9a4-a764-4f1c-bcd0-e472b106499e {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1029.962094] env[67893]: DEBUG oslo_concurrency.lockutils [req-048810b2-6355-4c0a-bbe0-6b149d1f42c7 req-41ebe6f8-08f2-4986-9a23-b1a87d6e87ff service nova] Acquiring lock "c05df6c1-e4c9-4276-9981-e80e584d540c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1029.962486] env[67893]: DEBUG oslo_concurrency.lockutils [req-048810b2-6355-4c0a-bbe0-6b149d1f42c7 req-41ebe6f8-08f2-4986-9a23-b1a87d6e87ff service nova] Lock "c05df6c1-e4c9-4276-9981-e80e584d540c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1029.965152] env[67893]: DEBUG oslo_concurrency.lockutils [req-048810b2-6355-4c0a-bbe0-6b149d1f42c7 req-41ebe6f8-08f2-4986-9a23-b1a87d6e87ff service nova] Lock "c05df6c1-e4c9-4276-9981-e80e584d540c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1029.965152] env[67893]: DEBUG nova.compute.manager [req-048810b2-6355-4c0a-bbe0-6b149d1f42c7 req-41ebe6f8-08f2-4986-9a23-b1a87d6e87ff service nova] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] No waiting events found dispatching network-vif-plugged-561ec9a4-a764-4f1c-bcd0-e472b106499e {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1029.965152] env[67893]: WARNING nova.compute.manager [req-048810b2-6355-4c0a-bbe0-6b149d1f42c7 req-41ebe6f8-08f2-4986-9a23-b1a87d6e87ff service nova] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Received unexpected event network-vif-plugged-561ec9a4-a764-4f1c-bcd0-e472b106499e for instance with vm_state building and task_state spawning.
[ 1029.965152] env[67893]: DEBUG nova.compute.manager [req-048810b2-6355-4c0a-bbe0-6b149d1f42c7 req-41ebe6f8-08f2-4986-9a23-b1a87d6e87ff service nova] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Received event network-changed-561ec9a4-a764-4f1c-bcd0-e472b106499e {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1029.965152] env[67893]: DEBUG nova.compute.manager [req-048810b2-6355-4c0a-bbe0-6b149d1f42c7 req-41ebe6f8-08f2-4986-9a23-b1a87d6e87ff service nova] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Refreshing instance network info cache due to event network-changed-561ec9a4-a764-4f1c-bcd0-e472b106499e. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1029.965152] env[67893]: DEBUG oslo_concurrency.lockutils [req-048810b2-6355-4c0a-bbe0-6b149d1f42c7 req-41ebe6f8-08f2-4986-9a23-b1a87d6e87ff service nova] Acquiring lock "refresh_cache-c05df6c1-e4c9-4276-9981-e80e584d540c" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1029.988320] env[67893]: DEBUG nova.network.neutron [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Updating instance_info_cache with network_info: [{"id": "561ec9a4-a764-4f1c-bcd0-e472b106499e", "address": "fa:16:3e:26:0f:92", "network": {"id": "673c23ac-6063-4c40-b811-8f7e16de42e3", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1270873717-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "368f4fa65bee4d6ba722728d1e453238", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9e6b7d9f-c4e9-4623-9eb5-840ca1a8224c", "external-id": "nsx-vlan-transportzone-782", "segmentation_id": 782, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap561ec9a4-a7", "ovs_interfaceid": "561ec9a4-a764-4f1c-bcd0-e472b106499e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1030.006022] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Releasing lock "refresh_cache-c05df6c1-e4c9-4276-9981-e80e584d540c" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1030.006022] env[67893]: DEBUG nova.compute.manager [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Instance network_info: |[{"id": "561ec9a4-a764-4f1c-bcd0-e472b106499e", "address": "fa:16:3e:26:0f:92", "network": {"id": "673c23ac-6063-4c40-b811-8f7e16de42e3", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1270873717-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "368f4fa65bee4d6ba722728d1e453238", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9e6b7d9f-c4e9-4623-9eb5-840ca1a8224c", "external-id": "nsx-vlan-transportzone-782", "segmentation_id": 782, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap561ec9a4-a7", "ovs_interfaceid": "561ec9a4-a764-4f1c-bcd0-e472b106499e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1030.006022] env[67893]: DEBUG oslo_concurrency.lockutils [req-048810b2-6355-4c0a-bbe0-6b149d1f42c7 req-41ebe6f8-08f2-4986-9a23-b1a87d6e87ff service nova] Acquired lock "refresh_cache-c05df6c1-e4c9-4276-9981-e80e584d540c" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1030.006022] env[67893]: DEBUG nova.network.neutron [req-048810b2-6355-4c0a-bbe0-6b149d1f42c7 req-41ebe6f8-08f2-4986-9a23-b1a87d6e87ff service nova] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Refreshing network info cache for port 561ec9a4-a764-4f1c-bcd0-e472b106499e {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1030.007190] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:26:0f:92', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '9e6b7d9f-c4e9-4623-9eb5-840ca1a8224c', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '561ec9a4-a764-4f1c-bcd0-e472b106499e', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1030.021222] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Creating folder: Project (368f4fa65bee4d6ba722728d1e453238). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1030.023120] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4ff72bc9-63be-43a3-99db-a73726d824e6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1030.040399] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Created folder: Project (368f4fa65bee4d6ba722728d1e453238) in parent group-v689771.
[ 1030.040965] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Creating folder: Instances. Parent ref: group-v689828. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1030.041483] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-65cf96c1-a125-4bbb-8596-64e29b7a0687 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1030.053031] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Created folder: Instances in parent group-v689828.
[ 1030.053031] env[67893]: DEBUG oslo.service.loopingcall [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1030.053031] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1030.053031] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b6280b98-5e8e-40ab-afe6-ba55ae2ef888 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1030.078352] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1030.078352] env[67893]: value = "task-3455379"
[ 1030.078352] env[67893]: _type = "Task"
[ 1030.078352] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1030.086187] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455379, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1030.589226] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455379, 'name': CreateVM_Task, 'duration_secs': 0.286448} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1030.589226] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1030.590253] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1030.590579] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1030.594470] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1030.594470] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-60475a96-4ae9-4cb9-80d6-92426dfd962d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1030.599185] env[67893]: DEBUG oslo_vmware.api [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Waiting for the task: (returnval){
[ 1030.599185] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]524d51d1-a5d1-6933-e67e-ef410c41d383"
[ 1030.599185] env[67893]: _type = "Task"
[ 1030.599185] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1030.605951] env[67893]: DEBUG oslo_vmware.api [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]524d51d1-a5d1-6933-e67e-ef410c41d383, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1030.621759] env[67893]: DEBUG nova.network.neutron [req-048810b2-6355-4c0a-bbe0-6b149d1f42c7 req-41ebe6f8-08f2-4986-9a23-b1a87d6e87ff service nova] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Updated VIF entry in instance network info cache for port 561ec9a4-a764-4f1c-bcd0-e472b106499e. {{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1030.621759] env[67893]: DEBUG nova.network.neutron [req-048810b2-6355-4c0a-bbe0-6b149d1f42c7 req-41ebe6f8-08f2-4986-9a23-b1a87d6e87ff service nova] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Updating instance_info_cache with network_info: [{"id": "561ec9a4-a764-4f1c-bcd0-e472b106499e", "address": "fa:16:3e:26:0f:92", "network": {"id": "673c23ac-6063-4c40-b811-8f7e16de42e3", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1270873717-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "368f4fa65bee4d6ba722728d1e453238", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9e6b7d9f-c4e9-4623-9eb5-840ca1a8224c", "external-id": "nsx-vlan-transportzone-782", "segmentation_id": 782, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap561ec9a4-a7", "ovs_interfaceid": "561ec9a4-a764-4f1c-bcd0-e472b106499e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1030.644205] env[67893]: DEBUG oslo_concurrency.lockutils [req-048810b2-6355-4c0a-bbe0-6b149d1f42c7 req-41ebe6f8-08f2-4986-9a23-b1a87d6e87ff service nova] Releasing lock "refresh_cache-c05df6c1-e4c9-4276-9981-e80e584d540c" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1031.108747] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1031.109029] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
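The Acquiring/Acquired/Releasing lines around the image cache above come from oslo.concurrency's named locks; the same pattern in two lines, with the lock name copied from the log (external/file-based locking omitted; the body is illustrative):

    from oslo_concurrency import lockutils

    with lockutils.lock('[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20'):
        pass  # work on the cached image is serialized within this process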
[ 1031.109245] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1038.513506] env[67893]: DEBUG oslo_concurrency.lockutils [None req-9f5c6c77-f230-4394-887b-bf3f4adfc47a tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Acquiring lock "2553f3c0-0988-4e11-a138-7e5f71e71f48" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1043.790614] env[67893]: DEBUG oslo_concurrency.lockutils [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "b3d31ca3-9a7a-49d0-955f-1e12808bf11f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1043.790884] env[67893]: DEBUG oslo_concurrency.lockutils [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "b3d31ca3-9a7a-49d0-955f-1e12808bf11f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1049.791354] env[67893]: DEBUG oslo_concurrency.lockutils [None req-48dbfc0d-a487-436f-a843-7c89d5b6f827 tempest-ServerTagsTestJSON-1065957766 tempest-ServerTagsTestJSON-1065957766-project-member] Acquiring lock "14f7c0cf-cbf0-4090-89a9-45fe4485cf31" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1049.791642] env[67893]: DEBUG oslo_concurrency.lockutils [None req-48dbfc0d-a487-436f-a843-7c89d5b6f827 tempest-ServerTagsTestJSON-1065957766 tempest-ServerTagsTestJSON-1065957766-project-member] Lock "14f7c0cf-cbf0-4090-89a9-45fe4485cf31" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1052.722642] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c945fa0c-71c6-49d1-80f9-8ae90c0c7469 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Acquiring lock "19ab9782-9131-46ba-bbf2-cc021953046e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1055.398682] env[67893]: DEBUG oslo_concurrency.lockutils [None req-08aff738-2c74-4d6d-b12a-8c96c6e80fb3 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Acquiring lock "d41abc6b-6519-4994-aa17-6b6bd94c93d9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1055.399047] env[67893]: DEBUG oslo_concurrency.lockutils [None req-08aff738-2c74-4d6d-b12a-8c96c6e80fb3 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Lock "d41abc6b-6519-4994-aa17-6b6bd94c93d9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1057.511213] env[67893]: DEBUG oslo_concurrency.lockutils [None req-9b391c00-097d-4240-bc00-1283fadbfb79 tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Acquiring lock "c05df6c1-e4c9-4276-9981-e80e584d540c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1061.032021] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1b4a6e13-dc09-48fa-9689-59668a4684f3 tempest-ServersV294TestFqdnHostnames-1022455702 tempest-ServersV294TestFqdnHostnames-1022455702-project-member] Acquiring lock "42938110-2d23-432a-bdb2-30750dac90b4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1061.032325] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1b4a6e13-dc09-48fa-9689-59668a4684f3 tempest-ServersV294TestFqdnHostnames-1022455702 tempest-ServersV294TestFqdnHostnames-1022455702-project-member] Lock "42938110-2d23-432a-bdb2-30750dac90b4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1061.530130] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1061.859227] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1061.859403] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 1061.859526] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 1061.882539] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1061.882725] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1061.882858] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1061.882983] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1061.883122] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1061.883343] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1061.883381] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1061.883490] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1061.883607] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1061.883721] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1061.883838] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 1061.884347] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1062.859107] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1063.854717] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1063.854857] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1063.877278] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1063.877558] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 1064.859699] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1064.859871] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1064.860080] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1064.871202] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1064.871426] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1064.871629] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
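The "Running periodic task ComputeManager._poll_*" lines are emitted by oslo.service's periodic-task machinery. A self-contained sketch of that mechanism; the manager class, task name, and spacing are illustrative, not Nova's:

    from oslo_config import cfg
    from oslo_service import periodic_task

    class Manager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(cfg.CONF)

        @periodic_task.periodic_task(spacing=60)
        def _poll_something(self, context):
            pass  # invoked by run_periodic_tasks() at most every 60 seconds

    Manager().run_periodic_tasks(context=None)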
"released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1064.871849] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1064.873050] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36191ee8-c3d5-4b2f-931b-81bb13cf8f9f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1064.882206] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18326370-91ae-4047-912a-e82984b83280 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1064.896086] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35e8e11d-032e-42b8-ab55-1c2167d70ed1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1064.902287] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e60e8352-cb66-4d6c-9727-328c25ed9cb3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1064.933055] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180992MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1064.933055] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1064.933055] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1065.005297] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2256af1c-4ff8-46b9-b568-c25ce8886e5f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1065.005512] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 6520080a-8bf1-4803-9099-87c3ba6e28e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1065.005689] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 19ab9782-9131-46ba-bbf2-cc021953046e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1065.005851] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2eb8d698-9436-4e91-bd10-5f5200415144 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1065.006023] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1065.006185] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1065.006344] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a9656a7e-8a7b-489e-9990-097c1e93e535 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1065.006499] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance fcae7119-6233-4a52-9e52-1147f2b10ddc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1065.006659] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2553f3c0-0988-4e11-a138-7e5f71e71f48 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1065.006817] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c05df6c1-e4c9-4276-9981-e80e584d540c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1065.018466] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 89e2963e-83e2-4e29-843d-7c15abdf78bc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1065.031459] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 88a48088-829d-40c1-85e1-6e78b8f5cea9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1065.042023] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5a24adaf-bced-4488-9ccb-fc996b2ba154 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1065.053544] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance d8f75420-059d-4af1-8545-b5c4f67f4fe3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1065.063477] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a6771d3c-90ff-4403-9124-e74d74256db8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1065.074422] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 993c926b-bdc5-4f7e-992a-aac8c658ea6c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1065.086203] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 429fffb6-8355-419d-8cbb-a406d723802b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1065.096176] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 8c9b4750-db4e-446b-b108-fc675c6f4c69 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1065.108052] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance aa02f91b-b125-42af-be9a-c565ed041288 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1065.117684] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance e720d161-0c76-47ab-8d24-e465109d6e8c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1065.128473] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5e7e68fb-f8a5-46c9-b0b1-9fbc96c82428 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1065.139187] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance f27780c8-3155-480a-bb3c-e93cdac254f2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1065.149039] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 15515d0f-d317-4cc3-a922-c8a64654f4b2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1065.158209] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c0e59ef6-c233-490f-ab69-ab198142590a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1065.167612] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance efdb0a7e-403d-4de5-8c09-72b9c8f9cd79 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1065.176676] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5ede1991-efee-4c34-af5b-ce71f67456ef has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1065.185914] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance b3d31ca3-9a7a-49d0-955f-1e12808bf11f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1065.196499] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 14f7c0cf-cbf0-4090-89a9-45fe4485cf31 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1065.207814] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance d41abc6b-6519-4994-aa17-6b6bd94c93d9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1065.216710] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 42938110-2d23-432a-bdb2-30750dac90b4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1065.216954] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1065.217114] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1065.536697] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27e444d8-b54f-411c-99a7-bcb8b3fcff3b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1065.544479] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5324d87f-42af-41b9-a7ca-6c8c4cb39aed {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1065.575487] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90b1adee-b621-49d5-9fd6-20c7df6daeb5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1065.583168] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c4ac9e1-20a7-4d65-8f5a-f1442c6ee75d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1065.596470] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1065.606790] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1065.619814] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1065.620014] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.687s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
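
The "Final resource view" and inventory records above can be cross-checked by hand: placement derives effective capacity as (total - reserved) * allocation_ratio, and used_ram=1792MB is exactly the 512MB reservation plus the ten tracked 128MB instances. A quick arithmetic sketch using only the figures from the log:

```python
# Cross-checking the inventory reported above; figures copied from the log.
inventory = {
    "VCPU":      {"total": 48,     "reserved": 0,   "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB":   {"total": 400,    "reserved": 0,   "allocation_ratio": 1.0},
}

for rc, inv in inventory.items():
    # Placement's effective capacity formula:
    capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
    print(f"{rc}: capacity {capacity:g}")

# VCPU: 192, MEMORY_MB: 196078, DISK_GB: 400 -- with used_vcpus=10 and
# used_ram=1792MB (512 reserved + 10 * 128), the node is far from exhausted,
# matching the "Final resource view" record.
```
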
[ 1073.636407] env[67893]: WARNING oslo_vmware.rw_handles [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1073.636407] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1073.636407] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1073.636407] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1073.636407] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1073.636407] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 1073.636407] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1073.636407] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1073.636407] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1073.636407] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1073.636407] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1073.636407] env[67893]: ERROR oslo_vmware.rw_handles [ 1073.636975] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/380aa38c-3f8f-4adc-92f7-04493259ba93/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1073.639086] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1073.639340] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Copying Virtual Disk [datastore1] vmware_temp/380aa38c-3f8f-4adc-92f7-04493259ba93/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/380aa38c-3f8f-4adc-92f7-04493259ba93/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1073.639629] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-42c6c306-bdce-450f-82c9-6586bfab42dd {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.647077] env[67893]: DEBUG oslo_vmware.api [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Waiting for the task: (returnval){ [ 1073.647077] env[67893]: value = "task-3455380" [ 1073.647077] env[67893]: _type = "Task" [ 1073.647077] env[67893]: } to complete.
{{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1073.655150] env[67893]: DEBUG oslo_vmware.api [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Task: {'id': task-3455380, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1074.158211] env[67893]: DEBUG oslo_vmware.exceptions [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1074.158518] env[67893]: DEBUG oslo_concurrency.lockutils [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1074.159085] env[67893]: ERROR nova.compute.manager [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1074.159085] env[67893]: Faults: ['InvalidArgument'] [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Traceback (most recent call last): [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] yield resources [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] self.driver.spawn(context, instance, image_meta, [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] self._fetch_image_if_missing(context, vi) [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] image_cache(vi, tmp_image_ds_loc) [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] 
File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] vm_util.copy_virtual_disk( [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] session._wait_for_task(vmdk_copy_task) [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] return self.wait_for_task(task_ref) [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] return evt.wait() [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] result = hub.switch() [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] return self.greenlet.switch() [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] self.f(*self.args, **self.kw) [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] raise exceptions.translate_fault(task_info.error) [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Faults: ['InvalidArgument'] [ 1074.159085] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] [ 1074.159947] env[67893]: INFO nova.compute.manager [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Terminating instance [ 1074.161134] env[67893]: DEBUG oslo_concurrency.lockutils [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1074.161234] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1074.161927] env[67893]: DEBUG nova.compute.manager [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1074.162130] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1074.162354] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-28b9e141-1162-41b0-9b95-f4e03e9d904f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1074.164717] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bced785-5fce-40da-9834-b542b7aa3a97 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1074.171432] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1074.171639] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c806fa15-8729-47fa-9a84-261541b0772e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1074.174011] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1074.174191] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1074.175107] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a5acf56f-de57-4aaa-9512-6b4ff12c9dea {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1074.179738] env[67893]: DEBUG oslo_vmware.api [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Waiting for the task: (returnval){ [ 1074.179738] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52a33474-335e-2070-f73d-6bf17cbe1339" [ 1074.179738] env[67893]: _type = "Task" [ 1074.179738] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1074.188806] env[67893]: DEBUG oslo_vmware.api [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52a33474-335e-2070-f73d-6bf17cbe1339, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1074.249019] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1074.249272] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1074.249452] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Deleting the datastore file [datastore1] 2256af1c-4ff8-46b9-b568-c25ce8886e5f {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1074.249710] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6ab7dba7-09f8-4ed1-8fd1-3c0e36b65170 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1074.256145] env[67893]: DEBUG oslo_vmware.api [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Waiting for the task: (returnval){ [ 1074.256145] env[67893]: value = "task-3455382" [ 1074.256145] env[67893]: _type = "Task" [ 1074.256145] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1074.264871] env[67893]: DEBUG oslo_vmware.api [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Task: {'id': task-3455382, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1074.689462] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1074.689776] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Creating directory with path [datastore1] vmware_temp/74cbe3f2-97b0-41d3-b1ae-e1ebfc9c8399/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1074.689917] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b0a8ea75-74eb-4470-8378-0f2ca9196ce2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1074.701802] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Created directory with path [datastore1] vmware_temp/74cbe3f2-97b0-41d3-b1ae-e1ebfc9c8399/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1074.702017] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Fetch image to [datastore1] vmware_temp/74cbe3f2-97b0-41d3-b1ae-e1ebfc9c8399/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1074.702208] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/74cbe3f2-97b0-41d3-b1ae-e1ebfc9c8399/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1074.702926] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efc2f827-b746-4fa7-9160-22e3cf360565 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1074.709263] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b93d965-7d5c-4fd6-8367-4f199ab16b34 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1074.717865] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc2e32bc-f040-42e5-b1d1-a6ab3f2a24b9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1074.747379] env[67893]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ea70d89-1b30-4400-aae0-6be76d7c3289 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1074.752849] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0d804799-bf8e-4bbf-b8b3-1313911e0276 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1074.764016] env[67893]: DEBUG oslo_vmware.api [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Task: {'id': task-3455382, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066147} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1074.764271] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1074.764455] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1074.764661] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1074.764819] env[67893]: INFO nova.compute.manager [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Took 0.60 seconds to destroy the instance on the hypervisor. 
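
The "Waiting for the task ... to complete", "progress is 0%." and "completed successfully ... 'duration_secs': 0.066147" records above come from polling a vCenter task until it reaches a terminal state. A generic sketch of that loop, assuming a hypothetical poll_task_info() callable in place of the PropertyCollector round-trips seen in the log; this illustrates the pattern, it is not oslo.vmware's implementation:

```python
import time

class TaskFailed(Exception):
    pass

def wait_for_task(poll_task_info, interval=0.5):
    """Poll a task's state until it succeeds or errors out."""
    while True:
        info = poll_task_info()              # one status round-trip
        if info["state"] == "success":
            return info.get("result")        # completion (duration is logged)
        if info["state"] == "error":
            raise TaskFailed(info["error"])  # the caller translates this into
                                             # a VimFault-style exception
        time.sleep(interval)                 # queued/running: poll again

# Simulated task that finishes on the second poll:
states = iter([{"state": "running"},
               {"state": "success", "result": "done"}])
print(wait_for_task(lambda: next(states), interval=0))
```
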
[ 1074.767097] env[67893]: DEBUG nova.compute.claims [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1074.767274] env[67893]: DEBUG oslo_concurrency.lockutils [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1074.767492] env[67893]: DEBUG oslo_concurrency.lockutils [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1074.776349] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1074.832337] env[67893]: DEBUG oslo_vmware.rw_handles [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/74cbe3f2-97b0-41d3-b1ae-e1ebfc9c8399/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1074.892553] env[67893]: DEBUG oslo_vmware.rw_handles [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1074.892743] env[67893]: DEBUG oslo_vmware.rw_handles [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/74cbe3f2-97b0-41d3-b1ae-e1ebfc9c8399/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
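
The rw_handles records above trace a write handle's full life: open an HTTP connection sized to the image (21318656 bytes here), stream the image iterator into it, then read the server's response while closing. The RemoteDisconnected traceback earlier in this log fired at exactly that last step, when the far end hung up before answering. An illustrative stand-in using only http.client; this is not oslo.vmware's code, just the same shape of interaction:

```python
import http.client

def upload_to_datastore(host, path, chunks, size):
    """Stream image chunks to a datastore URL over a sized HTTP PUT."""
    conn = http.client.HTTPSConnection(host, 443)
    conn.putrequest("PUT", path)
    conn.putheader("Content-Length", str(size))
    conn.endheaders()
    for chunk in chunks:           # "Completed reading data from the
        conn.send(chunk)           #  image iterator."
    resp = conn.getresponse()      # "Closing write handle" -- if the server
    conn.close()                   # hangs up first, this raises
    return resp.status             # http.client.RemoteDisconnected
```
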
[ 1075.190972] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cb6d8fa-d59c-4903-a29c-36c27cb01357 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1075.198458] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7dfee498-b0c7-4fe3-8752-89ec185abfb4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1075.228577] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd3ffc6d-bf4e-4ee6-815f-a084399bb4b2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1075.235607] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd0ff14c-25ea-4254-b7fc-fc361ae42e2b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1075.248338] env[67893]: DEBUG nova.compute.provider_tree [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1075.256977] env[67893]: DEBUG nova.scheduler.client.report [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1075.270871] env[67893]: DEBUG oslo_concurrency.lockutils [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.503s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1075.271424] env[67893]: ERROR nova.compute.manager [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1075.271424] env[67893]: Faults: ['InvalidArgument'] [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Traceback (most recent call last): [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1075.271424] env[67893]: ERROR
nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] self.driver.spawn(context, instance, image_meta, [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] self._fetch_image_if_missing(context, vi) [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] image_cache(vi, tmp_image_ds_loc) [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] vm_util.copy_virtual_disk( [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] session._wait_for_task(vmdk_copy_task) [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] return self.wait_for_task(task_ref) [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] return evt.wait() [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] result = hub.switch() [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] return self.greenlet.switch() [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] self.f(*self.args, **self.kw) [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] raise exceptions.translate_fault(task_info.error) [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Faults: ['InvalidArgument'] [ 1075.271424] env[67893]: ERROR nova.compute.manager [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] [ 1075.272186] env[67893]: DEBUG nova.compute.utils [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1075.273868] env[67893]: DEBUG nova.compute.manager [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Build of instance 2256af1c-4ff8-46b9-b568-c25ce8886e5f was re-scheduled: A specified parameter was not correct: fileType [ 1075.273868] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1075.274252] env[67893]: DEBUG nova.compute.manager [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1075.274425] env[67893]: DEBUG nova.compute.manager [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
[ 1075.274580] env[67893]: DEBUG nova.compute.manager [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1075.274742] env[67893]: DEBUG nova.network.neutron [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1075.616180] env[67893]: DEBUG nova.network.neutron [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1075.628886] env[67893]: INFO nova.compute.manager [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Took 0.35 seconds to deallocate network for instance. [ 1075.758147] env[67893]: INFO nova.scheduler.client.report [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Deleted allocations for instance 2256af1c-4ff8-46b9-b568-c25ce8886e5f [ 1075.782514] env[67893]: DEBUG oslo_concurrency.lockutils [None req-515cf0ea-0034-47e0-a565-c736ab3f438b tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Lock "2256af1c-4ff8-46b9-b568-c25ce8886e5f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 425.282s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1075.783699] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d1fed5f6-c1f8-4bb0-9825-ba0b473b5a13 tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Lock "2256af1c-4ff8-46b9-b568-c25ce8886e5f" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 226.474s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1075.783913] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d1fed5f6-c1f8-4bb0-9825-ba0b473b5a13 tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Acquiring lock "2256af1c-4ff8-46b9-b568-c25ce8886e5f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1075.784132] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d1fed5f6-c1f8-4bb0-9825-ba0b473b5a13 tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Lock "2256af1c-4ff8-46b9-b568-c25ce8886e5f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67893) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1075.784299] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d1fed5f6-c1f8-4bb0-9825-ba0b473b5a13 tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Lock "2256af1c-4ff8-46b9-b568-c25ce8886e5f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1075.786244] env[67893]: INFO nova.compute.manager [None req-d1fed5f6-c1f8-4bb0-9825-ba0b473b5a13 tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Terminating instance [ 1075.787866] env[67893]: DEBUG nova.compute.manager [None req-d1fed5f6-c1f8-4bb0-9825-ba0b473b5a13 tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1075.788096] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-d1fed5f6-c1f8-4bb0-9825-ba0b473b5a13 tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1075.788698] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f90a9379-3239-4315-a215-1b7c4523043d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1075.799109] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-574e1688-d589-410f-bf47-87467baef7bd {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1075.809685] env[67893]: DEBUG nova.compute.manager [None req-905c3871-724a-4a60-b48e-9dd9d69b26de tempest-ServerShowV247Test-307814888 tempest-ServerShowV247Test-307814888-project-member] [instance: 89e2963e-83e2-4e29-843d-7c15abdf78bc] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1075.832047] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-d1fed5f6-c1f8-4bb0-9825-ba0b473b5a13 tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 2256af1c-4ff8-46b9-b568-c25ce8886e5f could not be found. [ 1075.832047] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-d1fed5f6-c1f8-4bb0-9825-ba0b473b5a13 tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1075.832047] env[67893]: INFO nova.compute.manager [None req-d1fed5f6-c1f8-4bb0-9825-ba0b473b5a13 tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Took 0.04 seconds to destroy the instance on the hypervisor. 
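
The WARNING above shows terminate being idempotent: the VM was already gone from the backend, so InstanceNotFound is downgraded to a warning and the destroy is counted as complete rather than failed, letting network and allocation cleanup proceed. A stand-in sketch of that pattern, with hypothetical lookup_vm/unregister_vm callables rather than Nova's actual API:

```python
class InstanceNotFound(Exception):
    pass

def destroy_instance(lookup_vm, unregister_vm, uuid):
    """Tear down a VM; treat an already-absent VM as success."""
    try:
        unregister_vm(lookup_vm(uuid))
    except InstanceNotFound:
        # The goal state (no VM on the hypervisor) already holds.
        print(f"Instance {uuid} does not exist on backend, continuing teardown")

def lookup_vm(uuid):
    raise InstanceNotFound(uuid)   # simulate the VM already being deleted

destroy_instance(lookup_vm, lambda ref: None,
                 "2256af1c-4ff8-46b9-b568-c25ce8886e5f")
```
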
[ 1075.832047] env[67893]: DEBUG oslo.service.loopingcall [None req-d1fed5f6-c1f8-4bb0-9825-ba0b473b5a13 tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1075.832047] env[67893]: DEBUG nova.compute.manager [-] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1075.832047] env[67893]: DEBUG nova.network.neutron [-] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1075.835811] env[67893]: DEBUG nova.compute.manager [None req-905c3871-724a-4a60-b48e-9dd9d69b26de tempest-ServerShowV247Test-307814888 tempest-ServerShowV247Test-307814888-project-member] [instance: 89e2963e-83e2-4e29-843d-7c15abdf78bc] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1075.859224] env[67893]: DEBUG oslo_concurrency.lockutils [None req-905c3871-724a-4a60-b48e-9dd9d69b26de tempest-ServerShowV247Test-307814888 tempest-ServerShowV247Test-307814888-project-member] Lock "89e2963e-83e2-4e29-843d-7c15abdf78bc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 205.226s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1075.860572] env[67893]: DEBUG nova.network.neutron [-] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1075.868442] env[67893]: DEBUG nova.compute.manager [None req-7beee04a-5161-4269-b89d-15109176b7f5 tempest-ServerShowV247Test-307814888 tempest-ServerShowV247Test-307814888-project-member] [instance: 88a48088-829d-40c1-85e1-6e78b8f5cea9] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1075.872022] env[67893]: INFO nova.compute.manager [-] [instance: 2256af1c-4ff8-46b9-b568-c25ce8886e5f] Took 0.04 seconds to deallocate network for instance. [ 1075.890052] env[67893]: DEBUG nova.compute.manager [None req-7beee04a-5161-4269-b89d-15109176b7f5 tempest-ServerShowV247Test-307814888 tempest-ServerShowV247Test-307814888-project-member] [instance: 88a48088-829d-40c1-85e1-6e78b8f5cea9] Instance disappeared before build. 
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1075.910835] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7beee04a-5161-4269-b89d-15109176b7f5 tempest-ServerShowV247Test-307814888 tempest-ServerShowV247Test-307814888-project-member] Lock "88a48088-829d-40c1-85e1-6e78b8f5cea9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 203.344s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1075.920767] env[67893]: DEBUG nova.compute.manager [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1075.958390] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d1fed5f6-c1f8-4bb0-9825-ba0b473b5a13 tempest-TenantUsagesTestJSON-1358755841 tempest-TenantUsagesTestJSON-1358755841-project-member] Lock "2256af1c-4ff8-46b9-b568-c25ce8886e5f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.175s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1075.965516] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1075.965768] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1075.967173] env[67893]: INFO nova.compute.claims [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1076.314310] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d122736-4e85-47c3-bb78-d49ee12e30cb {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1076.321543] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31e12da2-5305-49ab-b665-2098f4a13756 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1076.356566] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c277821-e9f8-4fc4-99a3-70366f6ed2ff {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1076.364217] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with
opID=oslo.vmware-8421cda7-a6e6-48fa-8499-0c25e2c7af88 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1076.378618] env[67893]: DEBUG nova.compute.provider_tree [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1076.387205] env[67893]: DEBUG nova.scheduler.client.report [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1076.401037] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.435s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1076.401446] env[67893]: DEBUG nova.compute.manager [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1076.434668] env[67893]: DEBUG nova.compute.utils [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1076.435749] env[67893]: DEBUG nova.compute.manager [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1076.435940] env[67893]: DEBUG nova.network.neutron [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1076.445272] env[67893]: DEBUG nova.compute.manager [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Start building block device mappings for instance. 
{{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1076.491573] env[67893]: DEBUG nova.policy [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e1a872438ccd426db84cec76c2036435', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a8907299f165470d94c58acc601fd86f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 1076.508408] env[67893]: DEBUG nova.compute.manager [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Start spawning the instance on the hypervisor. {{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1076.537350] env[67893]: DEBUG nova.virt.hardware [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=<?>,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-15T20:39:38Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1076.537592] env[67893]: DEBUG nova.virt.hardware [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1076.537748] env[67893]: DEBUG nova.virt.hardware [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1076.537927] env[67893]: DEBUG nova.virt.hardware [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1076.538089] env[67893]: DEBUG nova.virt.hardware [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints
/opt/stack/nova/nova/virt/hardware.py:392}} [ 1076.538237] env[67893]: DEBUG nova.virt.hardware [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1076.538442] env[67893]: DEBUG nova.virt.hardware [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1076.538598] env[67893]: DEBUG nova.virt.hardware [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1076.538765] env[67893]: DEBUG nova.virt.hardware [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1076.538923] env[67893]: DEBUG nova.virt.hardware [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1076.539107] env[67893]: DEBUG nova.virt.hardware [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1076.539981] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd8cf995-5f02-4219-b6d3-007066f9f11b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1076.548132] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9aaba2e-2ed2-4bdd-893d-fdd20f3554b2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1076.880968] env[67893]: DEBUG nova.network.neutron [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Successfully created port: 03eb9822-1e5f-4573-be43-4782c7879409 {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1077.719019] env[67893]: DEBUG nova.compute.manager [req-1cce8644-2a03-46df-a250-cfc2b863cb1c req-43ef7b99-1956-4f66-a26c-919e6a94e399 service nova] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Received event network-vif-plugged-03eb9822-1e5f-4573-be43-4782c7879409 {{(pid=67893) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:11101}} [ 1077.719256] env[67893]: DEBUG oslo_concurrency.lockutils [req-1cce8644-2a03-46df-a250-cfc2b863cb1c req-43ef7b99-1956-4f66-a26c-919e6a94e399 service nova] Acquiring lock "5a24adaf-bced-4488-9ccb-fc996b2ba154-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1077.719481] env[67893]: DEBUG oslo_concurrency.lockutils [req-1cce8644-2a03-46df-a250-cfc2b863cb1c req-43ef7b99-1956-4f66-a26c-919e6a94e399 service nova] Lock "5a24adaf-bced-4488-9ccb-fc996b2ba154-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1077.719688] env[67893]: DEBUG oslo_concurrency.lockutils [req-1cce8644-2a03-46df-a250-cfc2b863cb1c req-43ef7b99-1956-4f66-a26c-919e6a94e399 service nova] Lock "5a24adaf-bced-4488-9ccb-fc996b2ba154-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1077.719864] env[67893]: DEBUG nova.compute.manager [req-1cce8644-2a03-46df-a250-cfc2b863cb1c req-43ef7b99-1956-4f66-a26c-919e6a94e399 service nova] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] No waiting events found dispatching network-vif-plugged-03eb9822-1e5f-4573-be43-4782c7879409 {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1077.720041] env[67893]: WARNING nova.compute.manager [req-1cce8644-2a03-46df-a250-cfc2b863cb1c req-43ef7b99-1956-4f66-a26c-919e6a94e399 service nova] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Received unexpected event network-vif-plugged-03eb9822-1e5f-4573-be43-4782c7879409 for instance with vm_state building and task_state spawning.
[ 1077.754048] env[67893]: DEBUG nova.network.neutron [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Successfully updated port: 03eb9822-1e5f-4573-be43-4782c7879409 {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1077.764850] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Acquiring lock "refresh_cache-5a24adaf-bced-4488-9ccb-fc996b2ba154" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1077.764952] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Acquired lock "refresh_cache-5a24adaf-bced-4488-9ccb-fc996b2ba154" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1077.765114] env[67893]: DEBUG nova.network.neutron [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1077.820601] env[67893]: DEBUG nova.network.neutron [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Instance cache missing network info. 
{{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1078.057811] env[67893]: DEBUG nova.network.neutron [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Updating instance_info_cache with network_info: [{"id": "03eb9822-1e5f-4573-be43-4782c7879409", "address": "fa:16:3e:47:c6:19", "network": {"id": "84dc64d2-5c98-4c6f-a1ca-c3e9ccbd04c0", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1664609498-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a8907299f165470d94c58acc601fd86f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d7836a5b-a91e-4d3f-8e96-afe024f62bb5", "external-id": "nsx-vlan-transportzone-419", "segmentation_id": 419, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap03eb9822-1e", "ovs_interfaceid": "03eb9822-1e5f-4573-be43-4782c7879409", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1078.070991] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Releasing lock "refresh_cache-5a24adaf-bced-4488-9ccb-fc996b2ba154" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1078.071302] env[67893]: DEBUG nova.compute.manager [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Instance network_info: |[{"id": "03eb9822-1e5f-4573-be43-4782c7879409", "address": "fa:16:3e:47:c6:19", "network": {"id": "84dc64d2-5c98-4c6f-a1ca-c3e9ccbd04c0", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1664609498-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a8907299f165470d94c58acc601fd86f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d7836a5b-a91e-4d3f-8e96-afe024f62bb5", "external-id": "nsx-vlan-transportzone-419", "segmentation_id": 419, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap03eb9822-1e", "ovs_interfaceid": "03eb9822-1e5f-4573-be43-4782c7879409", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| 
{{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1078.072047] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:47:c6:19', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd7836a5b-a91e-4d3f-8e96-afe024f62bb5', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '03eb9822-1e5f-4573-be43-4782c7879409', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1078.079169] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Creating folder: Project (a8907299f165470d94c58acc601fd86f). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1078.079718] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-750aabb2-871a-430b-9f85-dbca81112458 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1078.090704] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Created folder: Project (a8907299f165470d94c58acc601fd86f) in parent group-v689771. [ 1078.090921] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Creating folder: Instances. Parent ref: group-v689831. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1078.091150] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2d750e1a-156b-4d29-935d-7fe2ff7018ba {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1078.099301] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Created folder: Instances in parent group-v689831. [ 1078.099531] env[67893]: DEBUG oslo.service.loopingcall [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1078.099715] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1078.099964] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-fac79a1e-97bd-4246-b3f4-a410c3e4edd2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1078.119630] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1078.119630] env[67893]: value = "task-3455385" [ 1078.119630] env[67893]: _type = "Task" [ 1078.119630] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1078.126971] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455385, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1078.439363] env[67893]: DEBUG oslo_concurrency.lockutils [None req-789edfb5-549f-452d-8221-662354d8320b tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Acquiring lock "5a24adaf-bced-4488-9ccb-fc996b2ba154" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1078.629654] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455385, 'name': CreateVM_Task, 'duration_secs': 0.297415} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1078.629831] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1078.630521] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1078.630699] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1078.631044] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1078.631294] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5f28026f-6589-4087-a88c-82dbcb351561 {{(pid=67893) request_handler
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1078.635646] env[67893]: DEBUG oslo_vmware.api [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Waiting for the task: (returnval){ [ 1078.635646] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]524b7179-0519-6f71-59d7-be5389bd7495" [ 1078.635646] env[67893]: _type = "Task" [ 1078.635646] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1078.642861] env[67893]: DEBUG oslo_vmware.api [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]524b7179-0519-6f71-59d7-be5389bd7495, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1079.145876] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1079.146164] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1079.146351] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1079.743372] env[67893]: DEBUG nova.compute.manager [req-1ffc67f0-0804-4aab-9dc7-8df71328a4b6 req-7c640cb0-2067-4704-9426-0aa6a6854b90 service nova] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Received event network-changed-03eb9822-1e5f-4573-be43-4782c7879409 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1079.743537] env[67893]: DEBUG nova.compute.manager [req-1ffc67f0-0804-4aab-9dc7-8df71328a4b6 req-7c640cb0-2067-4704-9426-0aa6a6854b90 service nova] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Refreshing instance network info cache due to event network-changed-03eb9822-1e5f-4573-be43-4782c7879409. 
{{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1079.743739] env[67893]: DEBUG oslo_concurrency.lockutils [req-1ffc67f0-0804-4aab-9dc7-8df71328a4b6 req-7c640cb0-2067-4704-9426-0aa6a6854b90 service nova] Acquiring lock "refresh_cache-5a24adaf-bced-4488-9ccb-fc996b2ba154" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1079.743865] env[67893]: DEBUG oslo_concurrency.lockutils [req-1ffc67f0-0804-4aab-9dc7-8df71328a4b6 req-7c640cb0-2067-4704-9426-0aa6a6854b90 service nova] Acquired lock "refresh_cache-5a24adaf-bced-4488-9ccb-fc996b2ba154" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1079.744041] env[67893]: DEBUG nova.network.neutron [req-1ffc67f0-0804-4aab-9dc7-8df71328a4b6 req-7c640cb0-2067-4704-9426-0aa6a6854b90 service nova] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Refreshing network info cache for port 03eb9822-1e5f-4573-be43-4782c7879409 {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1080.097834] env[67893]: DEBUG nova.network.neutron [req-1ffc67f0-0804-4aab-9dc7-8df71328a4b6 req-7c640cb0-2067-4704-9426-0aa6a6854b90 service nova] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Updated VIF entry in instance network info cache for port 03eb9822-1e5f-4573-be43-4782c7879409. {{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1080.098210] env[67893]: DEBUG nova.network.neutron [req-1ffc67f0-0804-4aab-9dc7-8df71328a4b6 req-7c640cb0-2067-4704-9426-0aa6a6854b90 service nova] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Updating instance_info_cache with network_info: [{"id": "03eb9822-1e5f-4573-be43-4782c7879409", "address": "fa:16:3e:47:c6:19", "network": {"id": "84dc64d2-5c98-4c6f-a1ca-c3e9ccbd04c0", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1664609498-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a8907299f165470d94c58acc601fd86f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d7836a5b-a91e-4d3f-8e96-afe024f62bb5", "external-id": "nsx-vlan-transportzone-419", "segmentation_id": 419, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap03eb9822-1e", "ovs_interfaceid": "03eb9822-1e5f-4573-be43-4782c7879409", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1080.113574] env[67893]: DEBUG oslo_concurrency.lockutils [req-1ffc67f0-0804-4aab-9dc7-8df71328a4b6 req-7c640cb0-2067-4704-9426-0aa6a6854b90 service nova] Releasing lock "refresh_cache-5a24adaf-bced-4488-9ccb-fc996b2ba154" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1080.338287] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Acquiring lock 
"8dbbc2e6-9993-4bf0-b66b-6e685789221c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1080.338574] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Lock "8dbbc2e6-9993-4bf0-b66b-6e685789221c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1084.697028] env[67893]: DEBUG oslo_concurrency.lockutils [None req-99a775e0-5cfa-4dd7-8504-164edc52d18a tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Acquiring lock "783b7968-c130-47f5-9ad3-459d0e7eb746" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1084.697028] env[67893]: DEBUG oslo_concurrency.lockutils [None req-99a775e0-5cfa-4dd7-8504-164edc52d18a tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Lock "783b7968-c130-47f5-9ad3-459d0e7eb746" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1092.323893] env[67893]: DEBUG oslo_concurrency.lockutils [None req-27231dd0-db90-480e-a5f8-adcc2d483328 tempest-VolumesAdminNegativeTest-1794643428 tempest-VolumesAdminNegativeTest-1794643428-project-member] Acquiring lock "1591ce78-4293-4d03-be3f-a2cb552f51f7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1092.324598] env[67893]: DEBUG oslo_concurrency.lockutils [None req-27231dd0-db90-480e-a5f8-adcc2d483328 tempest-VolumesAdminNegativeTest-1794643428 tempest-VolumesAdminNegativeTest-1794643428-project-member] Lock "1591ce78-4293-4d03-be3f-a2cb552f51f7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1094.310028] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5d0af375-942f-46af-ab55-f6c3b906a963 tempest-MultipleCreateTestJSON-237684718 tempest-MultipleCreateTestJSON-237684718-project-member] Acquiring lock "46d7643f-00ab-4953-9a4c-e07b96615f2a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1094.310365] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5d0af375-942f-46af-ab55-f6c3b906a963 tempest-MultipleCreateTestJSON-237684718 tempest-MultipleCreateTestJSON-237684718-project-member] Lock "46d7643f-00ab-4953-9a4c-e07b96615f2a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 
1094.345808] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5d0af375-942f-46af-ab55-f6c3b906a963 tempest-MultipleCreateTestJSON-237684718 tempest-MultipleCreateTestJSON-237684718-project-member] Acquiring lock "dd596db2-a53c-4609-a1da-6db1ec79846e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1094.345995] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5d0af375-942f-46af-ab55-f6c3b906a963 tempest-MultipleCreateTestJSON-237684718 tempest-MultipleCreateTestJSON-237684718-project-member] Lock "dd596db2-a53c-4609-a1da-6db1ec79846e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1095.869460] env[67893]: DEBUG oslo_concurrency.lockutils [None req-8bf17f9f-a8b2-446e-96a0-b0ad0d8e23c6 tempest-SecurityGroupsTestJSON-756338800 tempest-SecurityGroupsTestJSON-756338800-project-member] Acquiring lock "039d691f-31fe-4020-90aa-82905198e13d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1095.869769] env[67893]: DEBUG oslo_concurrency.lockutils [None req-8bf17f9f-a8b2-446e-96a0-b0ad0d8e23c6 tempest-SecurityGroupsTestJSON-756338800 tempest-SecurityGroupsTestJSON-756338800-project-member] Lock "039d691f-31fe-4020-90aa-82905198e13d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1095.873071] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4549290e-89f7-4fe1-8a84-5564c0e9a898 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquiring lock "f7bcf0fe-9569-4b61-be9e-c29f4116cb11" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1095.873282] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4549290e-89f7-4fe1-8a84-5564c0e9a898 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock "f7bcf0fe-9569-4b61-be9e-c29f4116cb11" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1099.155749] env[67893]: DEBUG oslo_concurrency.lockutils [None req-872de797-89e9-4dc7-8428-9fa09a0d0f94 tempest-FloatingIPsAssociationNegativeTestJSON-1059429559 tempest-FloatingIPsAssociationNegativeTestJSON-1059429559-project-member] Acquiring lock "0363316a-cf39-4741-baa9-a040d7486df2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1099.156053] env[67893]: DEBUG oslo_concurrency.lockutils [None req-872de797-89e9-4dc7-8428-9fa09a0d0f94 tempest-FloatingIPsAssociationNegativeTestJSON-1059429559 tempest-FloatingIPsAssociationNegativeTestJSON-1059429559-project-member] Lock
"0363316a-cf39-4741-baa9-a040d7486df2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1099.686921] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4d7b3cdf-2c75-4414-a5f5-2ff9671ab70c tempest-ServerShowV257Test-908506870 tempest-ServerShowV257Test-908506870-project-member] Acquiring lock "9dce8f0a-8fbe-43a5-af0b-ab9f76055bef" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1099.687206] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4d7b3cdf-2c75-4414-a5f5-2ff9671ab70c tempest-ServerShowV257Test-908506870 tempest-ServerShowV257Test-908506870-project-member] Lock "9dce8f0a-8fbe-43a5-af0b-ab9f76055bef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1121.620408] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1121.859433] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1122.353456] env[67893]: WARNING oslo_vmware.rw_handles [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1122.353456] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1122.353456] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1122.353456] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1122.353456] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1122.353456] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 1122.353456] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1122.353456] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1122.353456] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1122.353456] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1122.353456] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1122.353456] env[67893]: ERROR oslo_vmware.rw_handles [ 1122.353821] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] 
[instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/74cbe3f2-97b0-41d3-b1ae-e1ebfc9c8399/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1122.356155] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1122.356459] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Copying Virtual Disk [datastore1] vmware_temp/74cbe3f2-97b0-41d3-b1ae-e1ebfc9c8399/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/74cbe3f2-97b0-41d3-b1ae-e1ebfc9c8399/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1122.356801] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5ced0cff-7179-400f-9746-1ffa16424043 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1122.365942] env[67893]: DEBUG oslo_vmware.api [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Waiting for the task: (returnval){ [ 1122.365942] env[67893]: value = "task-3455386" [ 1122.365942] env[67893]: _type = "Task" [ 1122.365942] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1122.374468] env[67893]: DEBUG oslo_vmware.api [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Task: {'id': task-3455386, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1122.859561] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1122.859826] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1122.859933] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1122.875676] env[67893]: DEBUG oslo_vmware.exceptions [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Fault InvalidArgument not matched. 
{{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1122.875932] env[67893]: DEBUG oslo_concurrency.lockutils [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1122.876481] env[67893]: ERROR nova.compute.manager [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1122.876481] env[67893]: Faults: ['InvalidArgument'] [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Traceback (most recent call last): [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] yield resources [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] self.driver.spawn(context, instance, image_meta, [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] self._fetch_image_if_missing(context, vi) [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] image_cache(vi, tmp_image_ds_loc) [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] vm_util.copy_virtual_disk( [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] session._wait_for_task(vmdk_copy_task) [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] return self.wait_for_task(task_ref) [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] return evt.wait() [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] result = hub.switch() [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] return self.greenlet.switch() [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] self.f(*self.args, **self.kw) [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] raise exceptions.translate_fault(task_info.error) [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Faults: ['InvalidArgument'] [ 1122.876481] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] [ 1122.877397] env[67893]: INFO nova.compute.manager [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Terminating instance [ 1122.878239] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1122.878438] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1122.879029] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-39876ee1-283a-4f4c-b705-6a32c54ec705 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1122.880730] env[67893]: DEBUG nova.compute.manager [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1122.880920] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1122.881622] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-869723e8-acaa-46d5-8c8e-236cafc347c5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1122.886829] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1122.886979] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1122.887128] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1122.887257] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1122.887379] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1122.887501] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1122.887620] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Skipping network cache update for instance because it is Building. 
[ 1122.887738] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1122.887854] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1122.887971] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1122.888104] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 1122.890501] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1122.890675] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1122.893214] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e4cc4800-9cc1-4eb8-a074-989f6db853f6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1122.895461] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1122.895664] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4e1b8702-747a-4209-b334-7751b36d2f13 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1122.899279] env[67893]: DEBUG oslo_vmware.api [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Waiting for the task: (returnval){
[ 1122.899279] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52538f12-f9e8-1f2a-919e-5f5d8a3c79fa"
[ 1122.899279] env[67893]: _type = "Task"
[ 1122.899279] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1122.905802] env[67893]: DEBUG oslo_vmware.api [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52538f12-f9e8-1f2a-919e-5f5d8a3c79fa, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1123.292685] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1123.292949] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1123.293288] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Deleting the datastore file [datastore1] 6520080a-8bf1-4803-9099-87c3ba6e28e4 {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1123.293576] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4bc4ae0f-5b39-43eb-8e09-26f58e8b89c7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1123.300340] env[67893]: DEBUG oslo_vmware.api [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Waiting for the task: (returnval){
[ 1123.300340] env[67893]: value = "task-3455388"
[ 1123.300340] env[67893]: _type = "Task"
[ 1123.300340] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1123.308289] env[67893]: DEBUG oslo_vmware.api [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Task: {'id': task-3455388, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
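Both waits above follow the same two-step pattern: invoke a *_Task method, get back a Task moref (a session-scoped key for SearchDatastore_Task, task-3455388 for the file delete), then block in wait_for_task() while the task is polled. A sketch of the delete-then-wait sequence, assuming a session and a datacenter moref are already at hand (names are illustrative):

```python
# Sketch: issue FileManager.DeleteDatastoreFile_Task and wait on it, roughly
# the sequence ds_util.file_delete() drives in the records above.
def delete_datastore_file(session, ds_path, datacenter_ref):
    file_manager = session.vim.service_content.fileManager
    task = session.invoke_api(session.vim, "DeleteDatastoreFile_Task",
                              file_manager,
                              name=ds_path,  # e.g. "[datastore1] 6520080a-8bf1-4803-9099-87c3ba6e28e4"
                              datacenter=datacenter_ref)
    # Blocks this greenthread; oslo.vmware polls the task until it succeeds
    # or raises the translated fault (see the traceback earlier).
    session.wait_for_task(task)
```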
[ 1123.408855] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1123.409155] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Creating directory with path [datastore1] vmware_temp/2fd69f99-cba0-4491-bb81-c2be72537c65/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1123.409461] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4bdec007-ee2f-4f01-ace2-4f4ceffa51b5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1123.420697] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Created directory with path [datastore1] vmware_temp/2fd69f99-cba0-4491-bb81-c2be72537c65/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1123.420890] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Fetch image to [datastore1] vmware_temp/2fd69f99-cba0-4491-bb81-c2be72537c65/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1123.421077] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/2fd69f99-cba0-4491-bb81-c2be72537c65/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1123.421821] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ee8da74-290d-4be2-bbc7-124d4da50a73 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1123.428447] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ef98967-6df5-47b0-9a69-e329d34708d0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1123.437316] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8934b810-c748-454f-a906-8ce1055a3347 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1123.468042] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41210ddd-e024-4bc3-80a9-34708fbb429c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1123.474312] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-9dde390f-f946-47f1-9618-7685fcd36f55 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1123.497476] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1123.547443] env[67893]: DEBUG oslo_vmware.rw_handles [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2fd69f99-cba0-4491-bb81-c2be72537c65/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1123.607861] env[67893]: DEBUG oslo_vmware.rw_handles [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1123.608073] env[67893]: DEBUG oslo_vmware.rw_handles [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2fd69f99-cba0-4491-bb81-c2be72537c65/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1123.812245] env[67893]: DEBUG oslo_vmware.api [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Task: {'id': task-3455388, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.084004} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
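The upload URL in the rw_handles records is the standard ESX/vCenter datastore file-access endpoint: https://<host>:<port>/folder/<path-in-datastore>?dcPath=<datacenter>&dsName=<datastore>. A small helper that composes such a URL, shown with the values logged above:

```python
# Sketch: compose a datastore file-access URL of the form used above.
from urllib.parse import quote, urlencode

def datastore_url(host, ds_path, dc_path, ds_name, port=443):
    query = urlencode({"dcPath": dc_path, "dsName": ds_name})
    return "https://%s:%d/folder/%s?%s" % (host, port, quote(ds_path), query)

print(datastore_url(
    "esx7c1n3.openstack.eu-de-1.cloud.sap",
    "vmware_temp/2fd69f99-cba0-4491-bb81-c2be72537c65/"
    "c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk",
    "ha-datacenter", "datastore1"))
# -> https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2fd69f99-cba0-4491-bb81-c2be72537c65/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1
```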
[ 1123.812245] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1123.812245] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1123.812245] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1123.812245] env[67893]: INFO nova.compute.manager [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Took 0.93 seconds to destroy the instance on the hypervisor.
[ 1123.814129] env[67893]: DEBUG nova.compute.claims [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1123.814312] env[67893]: DEBUG oslo_concurrency.lockutils [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1123.814524] env[67893]: DEBUG oslo_concurrency.lockutils [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1124.123430] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-610ab763-2720-4e66-a48d-e8ca47447b6d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1124.132386] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2f16f79-8e3c-4912-8090-5d72da2b02e7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1124.161225] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e404a59-3c03-4bb9-8e2a-2b890c5d1574 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1124.168497] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2ccce29-6881-43c1-ae8e-ac7ce518a578 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1124.181525] env[67893]: DEBUG nova.compute.provider_tree [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1124.189911] env[67893]: DEBUG nova.scheduler.client.report [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1124.205674] env[67893]: DEBUG oslo_concurrency.lockutils [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.391s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1124.206230] env[67893]: ERROR nova.compute.manager [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1124.206230] env[67893]: Faults: ['InvalidArgument']
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Traceback (most recent call last):
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4]     self.driver.spawn(context, instance, image_meta,
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4]     self._fetch_image_if_missing(context, vi)
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4]     image_cache(vi, tmp_image_ds_loc)
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4]     vm_util.copy_virtual_disk(
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4]     session._wait_for_task(vmdk_copy_task)
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4]     return self.wait_for_task(task_ref)
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4]     return evt.wait()
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4]     result = hub.switch()
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4]     return self.greenlet.switch()
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4]     self.f(*self.args, **self.kw)
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4]     raise exceptions.translate_fault(task_info.error)
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Faults: ['InvalidArgument']
[ 1124.206230] env[67893]: ERROR nova.compute.manager [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4]
[ 1124.207065] env[67893]: DEBUG nova.compute.utils [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1124.208321] env[67893]: DEBUG nova.compute.manager [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Build of instance 6520080a-8bf1-4803-9099-87c3ba6e28e4 was re-scheduled: A specified parameter was not correct: fileType
[ 1124.208321] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1124.208686] env[67893]: DEBUG nova.compute.manager [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1124.208857] env[67893]: DEBUG nova.compute.manager [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1124.209039] env[67893]: DEBUG nova.compute.manager [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1124.209210] env[67893]: DEBUG nova.network.neutron [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1124.550530] env[67893]: DEBUG nova.network.neutron [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1124.558846] env[67893]: INFO nova.compute.manager [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Took 0.35 seconds to deallocate network for instance.
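The provider inventory logged just before the traceback converts to schedulable capacity with placement's usual formula, capacity = (total - reserved) * allocation_ratio: 192 VCPU, 196078 MB of RAM and 400 GB of disk, while max_unit still caps any single instance at 16 VCPU / 65530 MB / 95 GB. A quick check against the logged dict:

```python
# Worked check of the inventory reported above.
inventory = {
    "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0},
}
for rc, inv in inventory.items():
    capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
    print(rc, capacity)  # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0
```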
[ 1124.659408] env[67893]: INFO nova.scheduler.client.report [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Deleted allocations for instance 6520080a-8bf1-4803-9099-87c3ba6e28e4
[ 1124.692400] env[67893]: DEBUG oslo_concurrency.lockutils [None req-de6f215a-a659-4b99-9a8e-bdf6aa011e88 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Lock "6520080a-8bf1-4803-9099-87c3ba6e28e4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 473.329s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1124.692400] env[67893]: DEBUG oslo_concurrency.lockutils [None req-131384d3-fdb9-401c-ab57-79d17eaa2a94 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Lock "6520080a-8bf1-4803-9099-87c3ba6e28e4" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 274.037s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1124.692400] env[67893]: DEBUG oslo_concurrency.lockutils [None req-131384d3-fdb9-401c-ab57-79d17eaa2a94 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Acquiring lock "6520080a-8bf1-4803-9099-87c3ba6e28e4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1124.692672] env[67893]: DEBUG oslo_concurrency.lockutils [None req-131384d3-fdb9-401c-ab57-79d17eaa2a94 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Lock "6520080a-8bf1-4803-9099-87c3ba6e28e4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1124.692728] env[67893]: DEBUG oslo_concurrency.lockutils [None req-131384d3-fdb9-401c-ab57-79d17eaa2a94 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Lock "6520080a-8bf1-4803-9099-87c3ba6e28e4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1124.694720] env[67893]: INFO nova.compute.manager [None req-131384d3-fdb9-401c-ab57-79d17eaa2a94 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Terminating instance
[ 1124.696764] env[67893]: DEBUG nova.compute.manager [None req-131384d3-fdb9-401c-ab57-79d17eaa2a94 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1124.696934] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-131384d3-fdb9-401c-ab57-79d17eaa2a94 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1124.697466] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-17d83616-d5fa-4683-9e84-1c4b1ddb8fed {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1124.704912] env[67893]: DEBUG nova.compute.manager [None req-b8c1a89c-287f-4e8d-bca0-ac6bb96df018 tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: d8f75420-059d-4af1-8545-b5c4f67f4fe3] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1124.711155] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed2fbb17-97a6-4b9c-866a-2ed81d9ad404 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1124.732077] env[67893]: DEBUG nova.compute.manager [None req-b8c1a89c-287f-4e8d-bca0-ac6bb96df018 tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: d8f75420-059d-4af1-8545-b5c4f67f4fe3] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 1124.742173] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-131384d3-fdb9-401c-ab57-79d17eaa2a94 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 6520080a-8bf1-4803-9099-87c3ba6e28e4 could not be found.
[ 1124.742378] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-131384d3-fdb9-401c-ab57-79d17eaa2a94 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1124.742552] env[67893]: INFO nova.compute.manager [None req-131384d3-fdb9-401c-ab57-79d17eaa2a94 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 1124.742791] env[67893]: DEBUG oslo.service.loopingcall [None req-131384d3-fdb9-401c-ab57-79d17eaa2a94 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
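The acquired/waited/held lines in this stretch are oslo.concurrency's lockutils instrumentation, and they tell the story of the retry: the terminate request waited 274.037 s on the instance-UUID lock until the failed build, which held it for 473.329 s, released it. A sketch of the producing pattern, with the instance UUID as the lock name:

```python
# Sketch: same-named locks serialize build and terminate, producing the
# "acquired ... waited" / "released ... held" lines above.
from oslo_concurrency import lockutils

@lockutils.synchronized("6520080a-8bf1-4803-9099-87c3ba6e28e4")
def do_terminate_instance():
    # Runs only after any other holder of this lock name (here the
    # long-running build attempt) releases it; lockutils logs the wait
    # and hold durations automatically.
    pass
```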
[ 1124.743281] env[67893]: DEBUG nova.compute.manager [-] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1124.743386] env[67893]: DEBUG nova.network.neutron [-] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1124.756278] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b8c1a89c-287f-4e8d-bca0-ac6bb96df018 tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Lock "d8f75420-059d-4af1-8545-b5c4f67f4fe3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 244.116s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1124.767887] env[67893]: DEBUG nova.compute.manager [None req-4457f057-ffb5-458b-a900-c25a20a3c02a tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: a6771d3c-90ff-4403-9124-e74d74256db8] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1124.771912] env[67893]: DEBUG nova.network.neutron [-] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1124.780230] env[67893]: INFO nova.compute.manager [-] [instance: 6520080a-8bf1-4803-9099-87c3ba6e28e4] Took 0.04 seconds to deallocate network for instance.
[ 1124.816734] env[67893]: DEBUG nova.compute.manager [None req-4457f057-ffb5-458b-a900-c25a20a3c02a tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: a6771d3c-90ff-4403-9124-e74d74256db8] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 1124.840713] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4457f057-ffb5-458b-a900-c25a20a3c02a tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Lock "a6771d3c-90ff-4403-9124-e74d74256db8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 242.971s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1124.853435] env[67893]: DEBUG nova.compute.manager [None req-b0d74f6c-573e-418d-b9ce-7fdd3db9c6d3 tempest-ServerRescueTestJSON-870876670 tempest-ServerRescueTestJSON-870876670-project-member] [instance: 993c926b-bdc5-4f7e-992a-aac8c658ea6c] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1124.859028] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1124.860348] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1124.860348] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1124.860348] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 1124.860348] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1124.877429] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1124.878389] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1124.878389] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1124.878389] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 1124.879054] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9865faf3-d243-4a02-9d9f-09b7a78782a8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1124.887631] env[67893]: DEBUG nova.compute.manager [None req-b0d74f6c-573e-418d-b9ce-7fdd3db9c6d3 tempest-ServerRescueTestJSON-870876670 tempest-ServerRescueTestJSON-870876670-project-member] [instance: 993c926b-bdc5-4f7e-992a-aac8c658ea6c] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 1124.891331] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10fd4fb5-d016-45f1-9afb-c2882faee5be {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1124.910787] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-208b5ab2-3a8e-4056-b9b8-a8c9590d8dcc {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1124.913697] env[67893]: DEBUG oslo_concurrency.lockutils [None req-131384d3-fdb9-401c-ab57-79d17eaa2a94 tempest-ServerMetadataNegativeTestJSON-1231201548 tempest-ServerMetadataNegativeTestJSON-1231201548-project-member] Lock "6520080a-8bf1-4803-9099-87c3ba6e28e4" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.222s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1124.919768] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29c7d5c6-f49f-46e4-b10b-6a00fd189565 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1124.923249] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b0d74f6c-573e-418d-b9ce-7fdd3db9c6d3 tempest-ServerRescueTestJSON-870876670 tempest-ServerRescueTestJSON-870876670-project-member] Lock "993c926b-bdc5-4f7e-992a-aac8c658ea6c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 238.672s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1124.953183] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180983MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 1124.953354] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1124.953548] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1124.955309] env[67893]: DEBUG nova.compute.manager [None req-9ecb397f-b783-4c03-8a0e-2c17cb092063 tempest-AttachInterfacesV270Test-1389174808 tempest-AttachInterfacesV270Test-1389174808-project-member] [instance: 429fffb6-8355-419d-8cbb-a406d723802b] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
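The Running periodic task records above come from oslo.service walking the ComputeManager methods registered as periodic tasks; a task such as _reclaim_queued_deletes simply returns early when its config check fails, which is the "skipping" line above. A minimal sketch of the declaration style, with illustrative names and spacing:

```python
# Sketch: how periodic tasks of the kind logged above are declared.
from oslo_service import periodic_task

class Manager(periodic_task.PeriodicTasks):
    @periodic_task.periodic_task(spacing=60)  # illustrative interval
    def _reclaim_queued_deletes(self, context):
        reclaim_interval = 0  # stand-in for CONF.reclaim_instance_interval
        if reclaim_interval <= 0:
            return  # logged as "CONF.reclaim_instance_interval <= 0, skipping..."
```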
[ 1124.982555] env[67893]: DEBUG nova.compute.manager [None req-9ecb397f-b783-4c03-8a0e-2c17cb092063 tempest-AttachInterfacesV270Test-1389174808 tempest-AttachInterfacesV270Test-1389174808-project-member] [instance: 429fffb6-8355-419d-8cbb-a406d723802b] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 1125.007222] env[67893]: DEBUG oslo_concurrency.lockutils [None req-9ecb397f-b783-4c03-8a0e-2c17cb092063 tempest-AttachInterfacesV270Test-1389174808 tempest-AttachInterfacesV270Test-1389174808-project-member] Lock "429fffb6-8355-419d-8cbb-a406d723802b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 236.838s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1125.017962] env[67893]: DEBUG nova.compute.manager [None req-b8b0949c-6630-4b49-83ca-41242a24d5d3 tempest-ServerActionsTestJSON-763823941 tempest-ServerActionsTestJSON-763823941-project-member] [instance: 8c9b4750-db4e-446b-b108-fc675c6f4c69] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1125.021616] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 19ab9782-9131-46ba-bbf2-cc021953046e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1125.021849] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2eb8d698-9436-4e91-bd10-5f5200415144 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1125.021905] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1125.022056] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1125.022211] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a9656a7e-8a7b-489e-9990-097c1e93e535 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1125.022332] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance fcae7119-6233-4a52-9e52-1147f2b10ddc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1125.022446] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2553f3c0-0988-4e11-a138-7e5f71e71f48 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1125.022558] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c05df6c1-e4c9-4276-9981-e80e584d540c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1125.022669] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5a24adaf-bced-4488-9ccb-fc996b2ba154 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1125.032557] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance efdb0a7e-403d-4de5-8c09-72b9c8f9cd79 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1125.046811] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5ede1991-efee-4c34-af5b-ce71f67456ef has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1125.048357] env[67893]: DEBUG nova.compute.manager [None req-b8b0949c-6630-4b49-83ca-41242a24d5d3 tempest-ServerActionsTestJSON-763823941 tempest-ServerActionsTestJSON-763823941-project-member] [instance: 8c9b4750-db4e-446b-b108-fc675c6f4c69] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 1125.057611] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance b3d31ca3-9a7a-49d0-955f-1e12808bf11f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1125.067129] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 14f7c0cf-cbf0-4090-89a9-45fe4485cf31 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1125.068944] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b8b0949c-6630-4b49-83ca-41242a24d5d3 tempest-ServerActionsTestJSON-763823941 tempest-ServerActionsTestJSON-763823941-project-member] Lock "8c9b4750-db4e-446b-b108-fc675c6f4c69" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 236.237s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1125.077812] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance d41abc6b-6519-4994-aa17-6b6bd94c93d9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1125.078866] env[67893]: DEBUG nova.compute.manager [None req-688b9f69-20d0-48ac-bdd1-47e2e364bdd5 tempest-SecurityGroupsTestJSON-756338800 tempest-SecurityGroupsTestJSON-756338800-project-member] [instance: aa02f91b-b125-42af-be9a-c565ed041288] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1125.086940] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 42938110-2d23-432a-bdb2-30750dac90b4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1125.096178] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 8dbbc2e6-9993-4bf0-b66b-6e685789221c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1125.101558] env[67893]: DEBUG nova.compute.manager [None req-688b9f69-20d0-48ac-bdd1-47e2e364bdd5 tempest-SecurityGroupsTestJSON-756338800 tempest-SecurityGroupsTestJSON-756338800-project-member] [instance: aa02f91b-b125-42af-be9a-c565ed041288] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 1125.106435] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 783b7968-c130-47f5-9ad3-459d0e7eb746 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1125.114804] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 1591ce78-4293-4d03-be3f-a2cb552f51f7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1125.121714] env[67893]: DEBUG oslo_concurrency.lockutils [None req-688b9f69-20d0-48ac-bdd1-47e2e364bdd5 tempest-SecurityGroupsTestJSON-756338800 tempest-SecurityGroupsTestJSON-756338800-project-member] Lock "aa02f91b-b125-42af-be9a-c565ed041288" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 231.778s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1125.124248] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 46d7643f-00ab-4953-9a4c-e07b96615f2a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1125.130978] env[67893]: DEBUG nova.compute.manager [None req-4dd29f0f-254b-4793-be89-17d9914c6169 tempest-VolumesAdminNegativeTest-1794643428 tempest-VolumesAdminNegativeTest-1794643428-project-member] [instance: e720d161-0c76-47ab-8d24-e465109d6e8c] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1125.133895] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance dd596db2-a53c-4609-a1da-6db1ec79846e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1125.142199] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 039d691f-31fe-4020-90aa-82905198e13d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1125.154327] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance f7bcf0fe-9569-4b61-be9e-c29f4116cb11 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1125.158176] env[67893]: DEBUG nova.compute.manager [None req-4dd29f0f-254b-4793-be89-17d9914c6169 tempest-VolumesAdminNegativeTest-1794643428 tempest-VolumesAdminNegativeTest-1794643428-project-member] [instance: e720d161-0c76-47ab-8d24-e465109d6e8c] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 1125.164363] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 0363316a-cf39-4741-baa9-a040d7486df2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1125.173752] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 9dce8f0a-8fbe-43a5-af0b-ab9f76055bef has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1125.173978] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 1125.174143] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 1125.178379] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4dd29f0f-254b-4793-be89-17d9914c6169 tempest-VolumesAdminNegativeTest-1794643428 tempest-VolumesAdminNegativeTest-1794643428-project-member] Lock "e720d161-0c76-47ab-8d24-e465109d6e8c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 231.119s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1125.187675] env[67893]: DEBUG nova.compute.manager [None req-e53cb8c5-4a11-461d-8895-82fc1b757527 tempest-ServerDiagnosticsV248Test-879533068 tempest-ServerDiagnosticsV248Test-879533068-project-member] [instance: 5e7e68fb-f8a5-46c9-b0b1-9fbc96c82428] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
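The final resource view is consistent with the allocations walked above: nine actively managed instances, each holding {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}, plus the 512 MB host reservation from the inventory, give used_ram = 512 + 9 * 128 = 1664 MB, used_disk = 9 GB and used_vcpus = 9:

```python
# Worked check of the "Final resource view" record above.
instances = 9                # "actively managed" instances listed above
per_instance = {"MEMORY_MB": 128, "DISK_GB": 1, "VCPU": 1}
reserved_ram_mb = 512        # MEMORY_MB "reserved" from the inventory

print(reserved_ram_mb + instances * per_instance["MEMORY_MB"])  # 1664 (used_ram MB)
print(instances * per_instance["DISK_GB"])                      # 9 (used_disk GB)
print(instances * per_instance["VCPU"])                         # 9 (used_vcpus)
```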
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1125.210894] env[67893]: DEBUG nova.compute.manager [None req-e53cb8c5-4a11-461d-8895-82fc1b757527 tempest-ServerDiagnosticsV248Test-879533068 tempest-ServerDiagnosticsV248Test-879533068-project-member] [instance: 5e7e68fb-f8a5-46c9-b0b1-9fbc96c82428] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1125.235499] env[67893]: DEBUG oslo_concurrency.lockutils [None req-e53cb8c5-4a11-461d-8895-82fc1b757527 tempest-ServerDiagnosticsV248Test-879533068 tempest-ServerDiagnosticsV248Test-879533068-project-member] Lock "5e7e68fb-f8a5-46c9-b0b1-9fbc96c82428" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 229.434s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1125.249615] env[67893]: DEBUG nova.compute.manager [None req-809f5967-6626-4000-83f7-7adfd49626a7 tempest-ListServerFiltersTestJSON-1036915369 tempest-ListServerFiltersTestJSON-1036915369-project-member] [instance: f27780c8-3155-480a-bb3c-e93cdac254f2] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1125.278040] env[67893]: DEBUG nova.compute.manager [None req-809f5967-6626-4000-83f7-7adfd49626a7 tempest-ListServerFiltersTestJSON-1036915369 tempest-ListServerFiltersTestJSON-1036915369-project-member] [instance: f27780c8-3155-480a-bb3c-e93cdac254f2] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1125.298618] env[67893]: DEBUG oslo_concurrency.lockutils [None req-809f5967-6626-4000-83f7-7adfd49626a7 tempest-ListServerFiltersTestJSON-1036915369 tempest-ListServerFiltersTestJSON-1036915369-project-member] Lock "f27780c8-3155-480a-bb3c-e93cdac254f2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 225.609s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1125.315354] env[67893]: DEBUG nova.compute.manager [None req-a7457600-f5bf-48a1-86e5-d08684aa55c5 tempest-ListServerFiltersTestJSON-1036915369 tempest-ListServerFiltersTestJSON-1036915369-project-member] [instance: 15515d0f-d317-4cc3-a922-c8a64654f4b2] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1125.347560] env[67893]: DEBUG nova.compute.manager [None req-a7457600-f5bf-48a1-86e5-d08684aa55c5 tempest-ListServerFiltersTestJSON-1036915369 tempest-ListServerFiltersTestJSON-1036915369-project-member] [instance: 15515d0f-d317-4cc3-a922-c8a64654f4b2] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
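The repeating pattern above, a roughly 230-second lock hold ending in "Instance disappeared before build.", comes from serializing each build on the instance UUID: a second request queues on the lock and, by the time it acquires it, the instance has already been deleted. A minimal sketch of that shape (not nova's actual code) using oslo.concurrency; instance_still_exists and do_build are hypothetical stand-ins:

from oslo_concurrency import lockutils

def instance_still_exists(instance_uuid):
    return False  # stand-in: would re-fetch the instance from the DB

def do_build(instance_uuid):
    pass          # stand-in for the real build-and-run logic

def build_and_run_instance(instance_uuid):
    @lockutils.synchronized(instance_uuid)
    def _locked_do_build_and_run_instance():
        # Re-check after acquiring the lock: the instance may have been
        # deleted while this request waited, hence the log's
        # "Instance disappeared before build."
        if not instance_still_exists(instance_uuid):
            return
        do_build(instance_uuid)

    _locked_do_build_and_run_instance()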
[ 1125.370532] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a7457600-f5bf-48a1-86e5-d08684aa55c5 tempest-ListServerFiltersTestJSON-1036915369 tempest-ListServerFiltersTestJSON-1036915369-project-member] Lock "15515d0f-d317-4cc3-a922-c8a64654f4b2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 225.210s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1125.382210] env[67893]: DEBUG nova.compute.manager [None req-268244dd-f80e-4d36-8668-e5378bec8848 tempest-ListServerFiltersTestJSON-1036915369 tempest-ListServerFiltersTestJSON-1036915369-project-member] [instance: c0e59ef6-c233-490f-ab69-ab198142590a] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1125.409920] env[67893]: DEBUG nova.compute.manager [None req-268244dd-f80e-4d36-8668-e5378bec8848 tempest-ListServerFiltersTestJSON-1036915369 tempest-ListServerFiltersTestJSON-1036915369-project-member] [instance: c0e59ef6-c233-490f-ab69-ab198142590a] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1125.433778] env[67893]: DEBUG oslo_concurrency.lockutils [None req-268244dd-f80e-4d36-8668-e5378bec8848 tempest-ListServerFiltersTestJSON-1036915369 tempest-ListServerFiltersTestJSON-1036915369-project-member] Lock "c0e59ef6-c233-490f-ab69-ab198142590a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 224.650s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1125.448825] env[67893]: DEBUG nova.compute.manager [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Starting instance...
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1125.491221] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63ce44dd-dd21-4a53-8eb3-8f90349f9071 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1125.494797] env[67893]: DEBUG oslo_concurrency.lockutils [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1125.498418] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5bd997c8-ac3e-435a-a0bb-0cb8cf47960c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1125.527219] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d847888-94e3-4188-a954-38dbf01e21f5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1125.533638] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c98573d-4287-4c98-a8f5-8798f28afe81 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1125.546749] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1125.554944] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1125.572913] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1125.572913] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.619s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1125.573136] env[67893]: DEBUG oslo_concurrency.lockutils [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.078s {{(pid=67893) 
inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1125.574562] env[67893]: INFO nova.compute.claims [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1125.907378] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57956429-7a34-49bf-a97f-45934ae82e61 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1125.915380] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4eef89f-4587-47b2-83f6-7d8d147ae5e2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1125.944440] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14b8bed8-d2d3-4812-95c4-9267be66e843 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1125.951322] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40327d58-db94-41c5-b079-84d24fd7b668 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1125.964790] env[67893]: DEBUG nova.compute.provider_tree [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1125.973847] env[67893]: DEBUG nova.scheduler.client.report [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1125.992190] env[67893]: DEBUG oslo_concurrency.lockutils [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.419s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1125.992620] env[67893]: DEBUG nova.compute.manager [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Start building networks asynchronously for instance. 
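For reference, the inventory data just logged is what placement uses to admit claims like the successful one above: schedulable capacity per resource class is (total - reserved) * allocation_ratio. A quick check with the numbers from this log:

# Capacity math behind the "Inventory has not changed" records above.
inventory = {
    'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
}
for rc, inv in inventory.items():
    capacity = int((inv['total'] - inv['reserved']) * inv['allocation_ratio'])
    print(rc, capacity)
# VCPU 192, MEMORY_MB 196078, DISK_GB 400 -- so the 9 used vCPUs and
# 1664MB in the final resource view are far below the limits, and the
# m1.nano claim (1 VCPU, 128MB, 1GB) succeeds immediately.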
{{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1126.026141] env[67893]: DEBUG nova.compute.utils [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1126.027822] env[67893]: DEBUG nova.compute.manager [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1126.028059] env[67893]: DEBUG nova.network.neutron [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1126.036828] env[67893]: DEBUG nova.compute.manager [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1126.109115] env[67893]: DEBUG nova.policy [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6bda7ef00cbc43848c9e0e6da5e9ac48', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e426128d24b946ee8c9d3d1b6d62243c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 1126.120202] env[67893]: DEBUG nova.compute.manager [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Start spawning the instance on the hypervisor. 
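The records above show the build fan-out: port creation starts asynchronously ("Allocating IP information in the background") while block device mappings are prepared, and the spawn path blocks on the network result only when it needs it. A rough sketch of that overlap, assuming eventlet (which nova runs under); allocate_ports and build_bdms are hypothetical stand-ins:

import eventlet

def allocate_ports(instance):
    return ['port-info']   # stand-in for neutron allocate_for_instance()

def build_bdms(instance):
    return []              # stand-in for block device mapping prep

def build_resources(instance):
    # Kick off network allocation in a greenthread...
    network_thread = eventlet.spawn(allocate_ports, instance)
    # ...do other build work while the ports are being created...
    bdms = build_bdms(instance)
    # ...and block on the result only when spawn actually needs it.
    network_info = network_thread.wait()
    return network_info, bdms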
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1126.145255] env[67893]: DEBUG nova.virt.hardware [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=<?>,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-15T20:39:38Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1126.145497] env[67893]: DEBUG nova.virt.hardware [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1126.145970] env[67893]: DEBUG nova.virt.hardware [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1126.146285] env[67893]: DEBUG nova.virt.hardware [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1126.146463] env[67893]: DEBUG nova.virt.hardware [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1126.146648] env[67893]: DEBUG nova.virt.hardware [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1126.146891] env[67893]: DEBUG nova.virt.hardware [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1126.147104] env[67893]: DEBUG nova.virt.hardware [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1126.147325] env[67893]: DEBUG
nova.virt.hardware [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1126.147543] env[67893]: DEBUG nova.virt.hardware [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1126.147759] env[67893]: DEBUG nova.virt.hardware [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1126.148716] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41c49d09-b06f-4d55-b59c-c3ad4a7474f7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1126.156812] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91ed74f1-25f1-4431-b112-5dd1fd6b6713 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1126.446173] env[67893]: DEBUG nova.network.neutron [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Successfully created port: cc583b19-ca24-4338-88cc-41797d8bbf31 {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1126.571578] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1126.571827] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1127.126700] env[67893]: DEBUG nova.compute.manager [req-aee9370e-2ba7-464b-8325-432b421df0e4 req-a7204f7c-89b0-46ce-b2ae-f8b2d7c80385 service nova] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Received event network-vif-plugged-cc583b19-ca24-4338-88cc-41797d8bbf31 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1127.126919] env[67893]: DEBUG oslo_concurrency.lockutils [req-aee9370e-2ba7-464b-8325-432b421df0e4 req-a7204f7c-89b0-46ce-b2ae-f8b2d7c80385 service nova] Acquiring lock "efdb0a7e-403d-4de5-8c09-72b9c8f9cd79-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1127.127145] env[67893]: DEBUG oslo_concurrency.lockutils [req-aee9370e-2ba7-464b-8325-432b421df0e4 req-a7204f7c-89b0-46ce-b2ae-f8b2d7c80385 service nova] Lock "efdb0a7e-403d-4de5-8c09-72b9c8f9cd79-events" acquired by
"nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1127.127316] env[67893]: DEBUG oslo_concurrency.lockutils [req-aee9370e-2ba7-464b-8325-432b421df0e4 req-a7204f7c-89b0-46ce-b2ae-f8b2d7c80385 service nova] Lock "efdb0a7e-403d-4de5-8c09-72b9c8f9cd79-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1127.127484] env[67893]: DEBUG nova.compute.manager [req-aee9370e-2ba7-464b-8325-432b421df0e4 req-a7204f7c-89b0-46ce-b2ae-f8b2d7c80385 service nova] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] No waiting events found dispatching network-vif-plugged-cc583b19-ca24-4338-88cc-41797d8bbf31 {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1127.127648] env[67893]: WARNING nova.compute.manager [req-aee9370e-2ba7-464b-8325-432b421df0e4 req-a7204f7c-89b0-46ce-b2ae-f8b2d7c80385 service nova] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Received unexpected event network-vif-plugged-cc583b19-ca24-4338-88cc-41797d8bbf31 for instance with vm_state building and task_state spawning. [ 1127.236451] env[67893]: DEBUG nova.network.neutron [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Successfully updated port: cc583b19-ca24-4338-88cc-41797d8bbf31 {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1127.253611] env[67893]: DEBUG oslo_concurrency.lockutils [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Acquiring lock "refresh_cache-efdb0a7e-403d-4de5-8c09-72b9c8f9cd79" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1127.253756] env[67893]: DEBUG oslo_concurrency.lockutils [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Acquired lock "refresh_cache-efdb0a7e-403d-4de5-8c09-72b9c8f9cd79" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1127.253903] env[67893]: DEBUG nova.network.neutron [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1127.488252] env[67893]: DEBUG nova.network.neutron [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Instance cache missing network info.
{{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1127.685918] env[67893]: DEBUG nova.network.neutron [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Updating instance_info_cache with network_info: [{"id": "cc583b19-ca24-4338-88cc-41797d8bbf31", "address": "fa:16:3e:7b:77:1e", "network": {"id": "910874f3-69c1-4d62-9101-dd1fb0c648c0", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1803764421-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e426128d24b946ee8c9d3d1b6d62243c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b49c5024-2ced-42ca-90cc-6066766d43e6", "external-id": "nsx-vlan-transportzone-239", "segmentation_id": 239, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcc583b19-ca", "ovs_interfaceid": "cc583b19-ca24-4338-88cc-41797d8bbf31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1127.700293] env[67893]: DEBUG oslo_concurrency.lockutils [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Releasing lock "refresh_cache-efdb0a7e-403d-4de5-8c09-72b9c8f9cd79" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1127.700293] env[67893]: DEBUG nova.compute.manager [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Instance network_info: |[{"id": "cc583b19-ca24-4338-88cc-41797d8bbf31", "address": "fa:16:3e:7b:77:1e", "network": {"id": "910874f3-69c1-4d62-9101-dd1fb0c648c0", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1803764421-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e426128d24b946ee8c9d3d1b6d62243c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b49c5024-2ced-42ca-90cc-6066766d43e6", "external-id": "nsx-vlan-transportzone-239", "segmentation_id": 239, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcc583b19-ca", "ovs_interfaceid": "cc583b19-ca24-4338-88cc-41797d8bbf31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1127.700293] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7b:77:1e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'b49c5024-2ced-42ca-90cc-6066766d43e6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'cc583b19-ca24-4338-88cc-41797d8bbf31', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1127.707996] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Creating folder: Project (e426128d24b946ee8c9d3d1b6d62243c). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1127.708097] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-83f624c7-a3e5-4bb0-88dd-055a9a4e5c35 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1127.721026] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Created folder: Project (e426128d24b946ee8c9d3d1b6d62243c) in parent group-v689771. [ 1127.721026] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Creating folder: Instances. Parent ref: group-v689834. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1127.721164] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9591a5da-87d5-46d4-b8a4-9d616fab5183 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1127.730659] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Created folder: Instances in parent group-v689834. [ 1127.730659] env[67893]: DEBUG oslo.service.loopingcall [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
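The two CreateFolder calls above build the usual two-level layout: a per-project folder under the OpenStack root (group-v689771 here), then an "Instances" folder beneath it, and the create tolerates the folder already existing. A sketch of that shape, assuming oslo.vmware; the session wiring and find_child_folder are hypothetical:

from oslo_vmware import exceptions as vexc

def find_child_folder(session, parent_ref, name):
    ...  # stand-in: would walk parent_ref's children looking for `name`

def get_or_create_folder(session, parent_ref, name):
    try:
        # Folder.CreateFolder, as invoked in the records above.
        return session.invoke_api(session.vim, 'CreateFolder',
                                  parent_ref, name=name)
    except vexc.DuplicateName:
        # Someone created it first; reuse the existing folder.
        return find_child_folder(session, parent_ref, name)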
{{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1127.730743] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1127.730942] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e5e49d99-9d6f-4e35-8081-3bf9858aaadb {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1127.750012] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1127.750012] env[67893]: value = "task-3455391" [ 1127.750012] env[67893]: _type = "Task" [ 1127.750012] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1127.757220] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455391, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1128.259952] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455391, 'name': CreateVM_Task, 'duration_secs': 0.304287} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1128.260205] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1128.260872] env[67893]: DEBUG oslo_concurrency.lockutils [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1128.261085] env[67893]: DEBUG oslo_concurrency.lockutils [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1128.261437] env[67893]: DEBUG oslo_concurrency.lockutils [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1128.261714] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-51544f47-20ac-4381-9965-f87b804b5545 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1128.266125] env[67893]: DEBUG oslo_vmware.api [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Waiting for the task: (returnval){ [ 1128.266125] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52b27617-c4e7-e56c-e2d7-ef30f87b5d0e" [ 1128.266125] env[67893]: _type = "Task" [ 1128.266125] env[67893]: } to complete. 
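CreateVM_Task above goes from "progress is 0%" to completed with duration_secs of 0.304287; wait_for_task is essentially a poll loop over the task info until it reaches a terminal state. A minimal sketch of that loop (the real one in oslo_vmware.api uses a looping call and richer error translation); get_task_info is a hypothetical callable performing one property read:

import time

def wait_for_task(get_task_info, interval=0.5):
    while True:
        info = get_task_info()    # one RetrievePropertiesEx round-trip
        if info['state'] == 'success':
            return info.get('result')
        if info['state'] == 'error':
            raise RuntimeError(info['error'])
        time.sleep(interval)      # 'queued'/'running': poll again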
{{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1128.273436] env[67893]: DEBUG oslo_vmware.api [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52b27617-c4e7-e56c-e2d7-ef30f87b5d0e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1128.776720] env[67893]: DEBUG oslo_concurrency.lockutils [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1128.776940] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1128.777178] env[67893]: DEBUG oslo_concurrency.lockutils [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1129.158174] env[67893]: DEBUG nova.compute.manager [req-6aec0910-261c-4753-9247-8bf4f30007e7 req-573132b8-4dca-431c-8c79-21329452eb30 service nova] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Received event network-changed-cc583b19-ca24-4338-88cc-41797d8bbf31 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1129.158378] env[67893]: DEBUG nova.compute.manager [req-6aec0910-261c-4753-9247-8bf4f30007e7 req-573132b8-4dca-431c-8c79-21329452eb30 service nova] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Refreshing instance network info cache due to event network-changed-cc583b19-ca24-4338-88cc-41797d8bbf31. 
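Note the lock discipline around the image cache in the records above: every probe or fetch of [datastore1] devstack-image-cache_base/<image> happens under a named lock, so concurrent builds on the same host cannot download the same VMDK twice. A compact sketch with oslo.concurrency; image_cached and fetch_image are hypothetical stand-ins:

from oslo_concurrency import lockutils

def image_cached(datastore, image_id):
    return False   # stand-in for the SearchDatastore_Task probe

def fetch_image(datastore, image_id):
    pass           # stand-in for download + cache on the datastore

def fetch_image_if_missing(datastore, image_id):
    name = '[%s] devstack-image-cache_base/%s/%s.vmdk' % (
        datastore, image_id, image_id)
    with lockutils.lock(name):
        if not image_cached(datastore, image_id):
            fetch_image(datastore, image_id)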
{{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1129.158589] env[67893]: DEBUG oslo_concurrency.lockutils [req-6aec0910-261c-4753-9247-8bf4f30007e7 req-573132b8-4dca-431c-8c79-21329452eb30 service nova] Acquiring lock "refresh_cache-efdb0a7e-403d-4de5-8c09-72b9c8f9cd79" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1129.158729] env[67893]: DEBUG oslo_concurrency.lockutils [req-6aec0910-261c-4753-9247-8bf4f30007e7 req-573132b8-4dca-431c-8c79-21329452eb30 service nova] Acquired lock "refresh_cache-efdb0a7e-403d-4de5-8c09-72b9c8f9cd79" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1129.158886] env[67893]: DEBUG nova.network.neutron [req-6aec0910-261c-4753-9247-8bf4f30007e7 req-573132b8-4dca-431c-8c79-21329452eb30 service nova] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Refreshing network info cache for port cc583b19-ca24-4338-88cc-41797d8bbf31 {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1129.491433] env[67893]: DEBUG nova.network.neutron [req-6aec0910-261c-4753-9247-8bf4f30007e7 req-573132b8-4dca-431c-8c79-21329452eb30 service nova] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Updated VIF entry in instance network info cache for port cc583b19-ca24-4338-88cc-41797d8bbf31. {{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1129.491801] env[67893]: DEBUG nova.network.neutron [req-6aec0910-261c-4753-9247-8bf4f30007e7 req-573132b8-4dca-431c-8c79-21329452eb30 service nova] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Updating instance_info_cache with network_info: [{"id": "cc583b19-ca24-4338-88cc-41797d8bbf31", "address": "fa:16:3e:7b:77:1e", "network": {"id": "910874f3-69c1-4d62-9101-dd1fb0c648c0", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1803764421-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e426128d24b946ee8c9d3d1b6d62243c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b49c5024-2ced-42ca-90cc-6066766d43e6", "external-id": "nsx-vlan-transportzone-239", "segmentation_id": 239, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcc583b19-ca", "ovs_interfaceid": "cc583b19-ca24-4338-88cc-41797d8bbf31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1129.502986] env[67893]: DEBUG oslo_concurrency.lockutils [req-6aec0910-261c-4753-9247-8bf4f30007e7 req-573132b8-4dca-431c-8c79-21329452eb30 service nova] Releasing lock "refresh_cache-efdb0a7e-403d-4de5-8c09-72b9c8f9cd79" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1135.225816] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d1843f30-62a7-4738-b1fa-eb5b96f0b761 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] 
Acquiring lock "efdb0a7e-403d-4de5-8c09-72b9c8f9cd79" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1172.372541] env[67893]: WARNING oslo_vmware.rw_handles [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1172.372541] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1172.372541] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1172.372541] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1172.372541] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1172.372541] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 1172.372541] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1172.372541] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1172.372541] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1172.372541] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1172.372541] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1172.372541] env[67893]: ERROR oslo_vmware.rw_handles [ 1172.373191] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/2fd69f99-cba0-4491-bb81-c2be72537c65/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1172.375120] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1172.375380] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Copying Virtual Disk [datastore1] vmware_temp/2fd69f99-cba0-4491-bb81-c2be72537c65/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/2fd69f99-cba0-4491-bb81-c2be72537c65/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1172.375692] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ba10c71f-f325-4974-830e-243d1a4076b1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1172.383959] env[67893]: DEBUG oslo_vmware.api 
[None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Waiting for the task: (returnval){ [ 1172.383959] env[67893]: value = "task-3455392" [ 1172.383959] env[67893]: _type = "Task" [ 1172.383959] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1172.391227] env[67893]: DEBUG oslo_vmware.api [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Task: {'id': task-3455392, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1172.894304] env[67893]: DEBUG oslo_vmware.exceptions [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1172.894691] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1172.895246] env[67893]: ERROR nova.compute.manager [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1172.895246] env[67893]: Faults: ['InvalidArgument'] [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Traceback (most recent call last): [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] yield resources [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] self.driver.spawn(context, instance, image_meta, [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] self._fetch_image_if_missing(context, vi) [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 
19ab9782-9131-46ba-bbf2-cc021953046e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] image_cache(vi, tmp_image_ds_loc) [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] vm_util.copy_virtual_disk( [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] session._wait_for_task(vmdk_copy_task) [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] return self.wait_for_task(task_ref) [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] return evt.wait() [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] result = hub.switch() [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] return self.greenlet.switch() [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] self.f(*self.args, **self.kw) [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] raise exceptions.translate_fault(task_info.error) [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Faults: ['InvalidArgument'] [ 1172.895246] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] [ 1172.896389] env[67893]: INFO nova.compute.manager [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 
tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Terminating instance [ 1172.897149] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1172.897358] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1172.897594] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d15f3b27-7513-4fea-8b3f-29d08587fb38 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1172.899818] env[67893]: DEBUG nova.compute.manager [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1172.900205] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1172.900695] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a281856-3ef7-4a22-a698-ea98cd9ed97e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1172.907279] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1172.907489] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-bef7714e-eaae-4a5b-8c9e-afcb362795dc {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1172.909665] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1172.909833] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Folder [datastore1] devstack-image-cache_base created. 
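The teardown interleaved above follows a fixed order: unregister the VM from the vCenter inventory, then delete its datastore directory (the DeleteDatastoreFile_Task polled just below), then report the instance destroyed. A sketch of that sequence in the shape of the oslo.vmware session API, illustrative only; vm_ref, file_manager and ds_path are placeholders:

def destroy_instance(session, vm_ref, file_manager, ds_path):
    # 1. VirtualMachine.UnregisterVM -- removes the VM from inventory
    #    without touching its files.
    session.invoke_api(session.vim, 'UnregisterVM', vm_ref)
    # 2. FileManager.DeleteDatastoreFile_Task -- returns a task that must
    #    be polled to completion, like task-3455394 above.
    task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                              file_manager, name=ds_path)
    session.wait_for_task(task)
    # 3. Only then does the manager log "Instance destroyed".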
{{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1172.910759] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1753791e-28dc-49ff-b175-2deb5e55014f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1172.915230] env[67893]: DEBUG oslo_vmware.api [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Waiting for the task: (returnval){ [ 1172.915230] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]5246d982-0616-55be-e6d1-eecb1d19d285" [ 1172.915230] env[67893]: _type = "Task" [ 1172.915230] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1172.928710] env[67893]: DEBUG oslo_vmware.api [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]5246d982-0616-55be-e6d1-eecb1d19d285, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1172.977551] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1172.977551] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1172.977721] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Deleting the datastore file [datastore1] 19ab9782-9131-46ba-bbf2-cc021953046e {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1172.977925] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e7f225b1-ad78-4635-a9d8-0c80e5886804 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1172.984659] env[67893]: DEBUG oslo_vmware.api [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Waiting for the task: (returnval){ [ 1172.984659] env[67893]: value = "task-3455394" [ 1172.984659] env[67893]: _type = "Task" [ 1172.984659] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1172.992662] env[67893]: DEBUG oslo_vmware.api [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Task: {'id': task-3455394, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1173.427020] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1173.427020] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Creating directory with path [datastore1] vmware_temp/8218b43b-57af-45c6-ab38-5dae24625ca7/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1173.427020] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0e73b02b-967e-41ca-9002-88a8119744d3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1173.441322] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Created directory with path [datastore1] vmware_temp/8218b43b-57af-45c6-ab38-5dae24625ca7/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1173.441507] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Fetch image to [datastore1] vmware_temp/8218b43b-57af-45c6-ab38-5dae24625ca7/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1173.441679] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/8218b43b-57af-45c6-ab38-5dae24625ca7/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1173.442493] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fecb8f5-85de-434c-8e51-d3521f81d806 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1173.449589] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f60ceb9f-0010-4ba6-a190-5a9035760216 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1173.459602] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61cd28f1-1760-4fcd-8b98-7d1199129bcf {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1173.493723] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-4a35fb67-f38a-4011-b482-ed6c8713073b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1173.501505] env[67893]: DEBUG oslo_vmware.api [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Task: {'id': task-3455394, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078484} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1173.503210] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1173.503441] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1173.503623] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1173.503800] env[67893]: INFO nova.compute.manager [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1173.506320] env[67893]: DEBUG nova.compute.claims [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1173.506320] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1173.506498] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1173.508959] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d170465a-d901-4b46-b935-7de03b137f89 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1173.533071] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1173.598304] env[67893]: DEBUG oslo_vmware.rw_handles [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8218b43b-57af-45c6-ab38-5dae24625ca7/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1173.683472] env[67893]: DEBUG oslo_vmware.rw_handles [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1173.683674] env[67893]: DEBUG oslo_vmware.rw_handles [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8218b43b-57af-45c6-ab38-5dae24625ca7/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1174.037044] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1197ae2e-780f-4cde-9aed-8218c2d52a0a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1174.047057] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c99403b-e5c9-4de4-a2b4-5e27f7bffce8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1174.088316] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b09b4511-cf34-436f-a62d-1dfe1063c6be {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1174.100571] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e3f20b3-7d17-4446-85e5-5eea868f2f66 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1174.123506] env[67893]: DEBUG nova.compute.provider_tree [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1174.140645] env[67893]: DEBUG nova.scheduler.client.report [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1174.165049] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.658s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1174.165049] env[67893]: ERROR nova.compute.manager [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1174.165049] env[67893]: Faults: ['InvalidArgument'] [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Traceback (most recent call last): [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1174.165049] 
env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] self.driver.spawn(context, instance, image_meta, [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] self._fetch_image_if_missing(context, vi) [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] image_cache(vi, tmp_image_ds_loc) [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] vm_util.copy_virtual_disk( [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] session._wait_for_task(vmdk_copy_task) [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] return self.wait_for_task(task_ref) [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] return evt.wait() [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] result = hub.switch() [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] return self.greenlet.switch() [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] self.f(*self.args, **self.kw) [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] raise exceptions.translate_fault(task_info.error) [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Faults: ['InvalidArgument'] [ 1174.165049] env[67893]: ERROR nova.compute.manager [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] [ 1174.166180] env[67893]: DEBUG nova.compute.utils [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1174.167321] env[67893]: DEBUG nova.compute.manager [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Build of instance 19ab9782-9131-46ba-bbf2-cc021953046e was re-scheduled: A specified parameter was not correct: fileType [ 1174.167321] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1174.167614] env[67893]: DEBUG nova.compute.manager [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1174.167789] env[67893]: DEBUG nova.compute.manager [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1174.167938] env[67893]: DEBUG nova.compute.manager [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1174.168110] env[67893]: DEBUG nova.network.neutron [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1174.798327] env[67893]: DEBUG nova.network.neutron [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1174.813937] env[67893]: INFO nova.compute.manager [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Took 0.65 seconds to deallocate network for instance. [ 1174.921273] env[67893]: INFO nova.scheduler.client.report [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Deleted allocations for instance 19ab9782-9131-46ba-bbf2-cc021953046e [ 1174.943342] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f2fc8041-c4c4-4ecd-8fd5-d5e1dd9cad03 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Lock "19ab9782-9131-46ba-bbf2-cc021953046e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 520.062s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1174.944538] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c945fa0c-71c6-49d1-80f9-8ae90c0c7469 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Lock "19ab9782-9131-46ba-bbf2-cc021953046e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 122.222s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1174.944753] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c945fa0c-71c6-49d1-80f9-8ae90c0c7469 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Acquiring lock "19ab9782-9131-46ba-bbf2-cc021953046e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1174.944953] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c945fa0c-71c6-49d1-80f9-8ae90c0c7469 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Lock "19ab9782-9131-46ba-bbf2-cc021953046e-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1174.945134] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c945fa0c-71c6-49d1-80f9-8ae90c0c7469 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Lock "19ab9782-9131-46ba-bbf2-cc021953046e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1174.947075] env[67893]: INFO nova.compute.manager [None req-c945fa0c-71c6-49d1-80f9-8ae90c0c7469 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Terminating instance [ 1174.949607] env[67893]: DEBUG nova.compute.manager [None req-c945fa0c-71c6-49d1-80f9-8ae90c0c7469 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1174.949607] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c945fa0c-71c6-49d1-80f9-8ae90c0c7469 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1174.949771] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-59f19122-a71a-45a1-a744-a95c5e3c8c63 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1174.958924] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7df7db83-ef13-4330-a583-3fcbfcc47062 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1174.970238] env[67893]: DEBUG nova.compute.manager [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1174.990080] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-c945fa0c-71c6-49d1-80f9-8ae90c0c7469 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 19ab9782-9131-46ba-bbf2-cc021953046e could not be found. 
[ 1174.990281] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c945fa0c-71c6-49d1-80f9-8ae90c0c7469 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1174.990458] env[67893]: INFO nova.compute.manager [None req-c945fa0c-71c6-49d1-80f9-8ae90c0c7469 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1174.990695] env[67893]: DEBUG oslo.service.loopingcall [None req-c945fa0c-71c6-49d1-80f9-8ae90c0c7469 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1174.990929] env[67893]: DEBUG nova.compute.manager [-] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1174.990999] env[67893]: DEBUG nova.network.neutron [-] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1175.015572] env[67893]: DEBUG nova.network.neutron [-] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1175.018267] env[67893]: DEBUG oslo_concurrency.lockutils [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1175.018497] env[67893]: DEBUG oslo_concurrency.lockutils [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1175.019866] env[67893]: INFO nova.compute.claims [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1175.023821] env[67893]: INFO nova.compute.manager [-] [instance: 19ab9782-9131-46ba-bbf2-cc021953046e] Took 0.03 seconds to deallocate network for instance.
[ 1175.109137] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c945fa0c-71c6-49d1-80f9-8ae90c0c7469 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Lock "19ab9782-9131-46ba-bbf2-cc021953046e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.165s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1175.366351] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-595a4a8b-34f3-430c-a004-daa2956826e2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1175.374140] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-858840c1-cfae-4782-b6a5-d9fa3c11ccdb {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1175.403481] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fba7e2bd-6b45-49d0-8362-7935bca3d736 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1175.410499] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2681d4e-206d-4b18-881e-f2ea683cc000 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1175.423420] env[67893]: DEBUG nova.compute.provider_tree [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1175.432286] env[67893]: DEBUG nova.scheduler.client.report [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1175.448096] env[67893]: DEBUG oslo_concurrency.lockutils [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.429s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1175.448562] env[67893]: DEBUG nova.compute.manager [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Start building networks asynchronously for instance.
{{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1175.482085] env[67893]: DEBUG nova.compute.utils [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1175.483387] env[67893]: DEBUG nova.compute.manager [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1175.483555] env[67893]: DEBUG nova.network.neutron [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1175.492477] env[67893]: DEBUG nova.compute.manager [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1175.546691] env[67893]: DEBUG nova.policy [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9115f73c22bf4b0e9e5439363832061d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7a19d9bde3814325847c06cec1af09b7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 1175.555550] env[67893]: DEBUG nova.compute.manager [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1175.583473] env[67893]: DEBUG nova.virt.hardware [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1175.583776] env[67893]: DEBUG nova.virt.hardware [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1175.583988] env[67893]: DEBUG nova.virt.hardware [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1175.584259] env[67893]: DEBUG nova.virt.hardware [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1175.584701] env[67893]: DEBUG nova.virt.hardware [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1175.584701] env[67893]: DEBUG nova.virt.hardware [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1175.584871] env[67893]: DEBUG nova.virt.hardware [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1175.585085] env[67893]: DEBUG nova.virt.hardware [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1175.585311] 
env[67893]: DEBUG nova.virt.hardware [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1175.585514] env[67893]: DEBUG nova.virt.hardware [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1175.585862] env[67893]: DEBUG nova.virt.hardware [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1175.586680] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf63c67b-36cc-4970-a3ff-f9fc0ba98c9f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1175.594757] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6892d1a1-4d3a-4fd0-98c1-a5b8eeacf339 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1175.935184] env[67893]: DEBUG nova.network.neutron [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Successfully created port: 28762914-c95b-447a-9cab-1bdefd5f89bd {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1176.583199] env[67893]: DEBUG nova.network.neutron [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Successfully updated port: 28762914-c95b-447a-9cab-1bdefd5f89bd {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1176.596692] env[67893]: DEBUG oslo_concurrency.lockutils [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "refresh_cache-5ede1991-efee-4c34-af5b-ce71f67456ef" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1176.596692] env[67893]: DEBUG oslo_concurrency.lockutils [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquired lock "refresh_cache-5ede1991-efee-4c34-af5b-ce71f67456ef" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1176.596692] env[67893]: DEBUG nova.network.neutron [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1176.632895] env[67893]: DEBUG 
nova.network.neutron [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1176.803368] env[67893]: DEBUG nova.network.neutron [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Updating instance_info_cache with network_info: [{"id": "28762914-c95b-447a-9cab-1bdefd5f89bd", "address": "fa:16:3e:5c:7d:af", "network": {"id": "b5038471-f3b2-4f1f-b2f9-62effa71f1aa", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1405799721-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7a19d9bde3814325847c06cec1af09b7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ef02af-c508-432f-ae29-3a219701d584", "external-id": "nsx-vlan-transportzone-313", "segmentation_id": 313, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap28762914-c9", "ovs_interfaceid": "28762914-c95b-447a-9cab-1bdefd5f89bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1176.814061] env[67893]: DEBUG oslo_concurrency.lockutils [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Releasing lock "refresh_cache-5ede1991-efee-4c34-af5b-ce71f67456ef" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1176.814375] env[67893]: DEBUG nova.compute.manager [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Instance network_info: |[{"id": "28762914-c95b-447a-9cab-1bdefd5f89bd", "address": "fa:16:3e:5c:7d:af", "network": {"id": "b5038471-f3b2-4f1f-b2f9-62effa71f1aa", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1405799721-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7a19d9bde3814325847c06cec1af09b7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ef02af-c508-432f-ae29-3a219701d584", "external-id": "nsx-vlan-transportzone-313", "segmentation_id": 313, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap28762914-c9", 
"ovs_interfaceid": "28762914-c95b-447a-9cab-1bdefd5f89bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1176.814762] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:5c:7d:af', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '89ef02af-c508-432f-ae29-3a219701d584', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '28762914-c95b-447a-9cab-1bdefd5f89bd', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1176.822215] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Creating folder: Project (7a19d9bde3814325847c06cec1af09b7). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1176.822738] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-92309bf1-8069-45c3-bf76-1f9f770bc04a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1176.833628] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Created folder: Project (7a19d9bde3814325847c06cec1af09b7) in parent group-v689771. [ 1176.833798] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Creating folder: Instances. Parent ref: group-v689837. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1176.834076] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f2e4a6c5-b399-4c1e-a326-3c6481e5fee8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1176.842790] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Created folder: Instances in parent group-v689837. [ 1176.843663] env[67893]: DEBUG oslo.service.loopingcall [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1176.843663] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1176.843663] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f41871b0-62ef-4257-8a11-eca4f0e1d64c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1176.858187] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1176.858349] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Cleaning up deleted instances {{(pid=67893) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1176.865148] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1176.865148] env[67893]: value = "task-3455397" [ 1176.865148] env[67893]: _type = "Task" [ 1176.865148] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1176.870744] env[67893]: DEBUG nova.compute.manager [req-7a6ef4ae-502a-4246-9bb5-449c17a7e423 req-d20fdf73-cc7e-441e-a009-9459f5a63332 service nova] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Received event network-vif-plugged-28762914-c95b-447a-9cab-1bdefd5f89bd {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1176.870912] env[67893]: DEBUG oslo_concurrency.lockutils [req-7a6ef4ae-502a-4246-9bb5-449c17a7e423 req-d20fdf73-cc7e-441e-a009-9459f5a63332 service nova] Acquiring lock "5ede1991-efee-4c34-af5b-ce71f67456ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1176.871794] env[67893]: DEBUG oslo_concurrency.lockutils [req-7a6ef4ae-502a-4246-9bb5-449c17a7e423 req-d20fdf73-cc7e-441e-a009-9459f5a63332 service nova] Lock "5ede1991-efee-4c34-af5b-ce71f67456ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1176.871794] env[67893]: DEBUG oslo_concurrency.lockutils [req-7a6ef4ae-502a-4246-9bb5-449c17a7e423 req-d20fdf73-cc7e-441e-a009-9459f5a63332 service nova] Lock "5ede1991-efee-4c34-af5b-ce71f67456ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1176.871794] env[67893]: DEBUG nova.compute.manager [req-7a6ef4ae-502a-4246-9bb5-449c17a7e423 req-d20fdf73-cc7e-441e-a009-9459f5a63332 service nova] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] No waiting events found dispatching network-vif-plugged-28762914-c95b-447a-9cab-1bdefd5f89bd {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1176.872161] env[67893]: WARNING nova.compute.manager [req-7a6ef4ae-502a-4246-9bb5-449c17a7e423 req-d20fdf73-cc7e-441e-a009-9459f5a63332 service nova] [instance:
5ede1991-efee-4c34-af5b-ce71f67456ef] Received unexpected event network-vif-plugged-28762914-c95b-447a-9cab-1bdefd5f89bd for instance with vm_state building and task_state spawning. [ 1176.872161] env[67893]: DEBUG nova.compute.manager [req-7a6ef4ae-502a-4246-9bb5-449c17a7e423 req-d20fdf73-cc7e-441e-a009-9459f5a63332 service nova] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Received event network-changed-28762914-c95b-447a-9cab-1bdefd5f89bd {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1176.872316] env[67893]: DEBUG nova.compute.manager [req-7a6ef4ae-502a-4246-9bb5-449c17a7e423 req-d20fdf73-cc7e-441e-a009-9459f5a63332 service nova] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Refreshing instance network info cache due to event network-changed-28762914-c95b-447a-9cab-1bdefd5f89bd. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1176.872469] env[67893]: DEBUG oslo_concurrency.lockutils [req-7a6ef4ae-502a-4246-9bb5-449c17a7e423 req-d20fdf73-cc7e-441e-a009-9459f5a63332 service nova] Acquiring lock "refresh_cache-5ede1991-efee-4c34-af5b-ce71f67456ef" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1176.872607] env[67893]: DEBUG oslo_concurrency.lockutils [req-7a6ef4ae-502a-4246-9bb5-449c17a7e423 req-d20fdf73-cc7e-441e-a009-9459f5a63332 service nova] Acquired lock "refresh_cache-5ede1991-efee-4c34-af5b-ce71f67456ef" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1176.872762] env[67893]: DEBUG nova.network.neutron [req-7a6ef4ae-502a-4246-9bb5-449c17a7e423 req-d20fdf73-cc7e-441e-a009-9459f5a63332 service nova] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Refreshing network info cache for port 28762914-c95b-447a-9cab-1bdefd5f89bd {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1176.877358] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] There are 0 instances to clean {{(pid=67893) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1176.880234] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455397, 'name': CreateVM_Task} progress is 5%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1177.287723] env[67893]: DEBUG nova.network.neutron [req-7a6ef4ae-502a-4246-9bb5-449c17a7e423 req-d20fdf73-cc7e-441e-a009-9459f5a63332 service nova] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Updated VIF entry in instance network info cache for port 28762914-c95b-447a-9cab-1bdefd5f89bd. 
{{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1177.288085] env[67893]: DEBUG nova.network.neutron [req-7a6ef4ae-502a-4246-9bb5-449c17a7e423 req-d20fdf73-cc7e-441e-a009-9459f5a63332 service nova] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Updating instance_info_cache with network_info: [{"id": "28762914-c95b-447a-9cab-1bdefd5f89bd", "address": "fa:16:3e:5c:7d:af", "network": {"id": "b5038471-f3b2-4f1f-b2f9-62effa71f1aa", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1405799721-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7a19d9bde3814325847c06cec1af09b7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ef02af-c508-432f-ae29-3a219701d584", "external-id": "nsx-vlan-transportzone-313", "segmentation_id": 313, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap28762914-c9", "ovs_interfaceid": "28762914-c95b-447a-9cab-1bdefd5f89bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1177.297725] env[67893]: DEBUG oslo_concurrency.lockutils [req-7a6ef4ae-502a-4246-9bb5-449c17a7e423 req-d20fdf73-cc7e-441e-a009-9459f5a63332 service nova] Releasing lock "refresh_cache-5ede1991-efee-4c34-af5b-ce71f67456ef" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1177.375060] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455397, 'name': CreateVM_Task, 'duration_secs': 0.310383} completed successfully. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1177.375243] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1177.375902] env[67893]: DEBUG oslo_concurrency.lockutils [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1177.376082] env[67893]: DEBUG oslo_concurrency.lockutils [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1177.376405] env[67893]: DEBUG oslo_concurrency.lockutils [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1177.376709] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6d72637a-70e4-4367-a0ad-f33799ba47e4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1177.381443] env[67893]: DEBUG oslo_vmware.api [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Waiting for the task: (returnval){ [ 1177.381443] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]528aded3-dc72-a2e3-04c5-ace3cb14a26d" [ 1177.381443] env[67893]: _type = "Task" [ 1177.381443] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1177.396276] env[67893]: DEBUG oslo_vmware.api [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]528aded3-dc72-a2e3-04c5-ace3cb14a26d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1177.894241] env[67893]: DEBUG oslo_concurrency.lockutils [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1177.894501] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1177.894715] env[67893]: DEBUG oslo_concurrency.lockutils [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1178.859470] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1178.860063] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Cleaning up deleted instances with incomplete migration {{(pid=67893) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1180.872876] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1181.859591] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1182.107053] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Acquiring lock "1068cd1b-317e-42d5-b348-5bfdbb2b4dc0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1182.107502] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Lock "1068cd1b-317e-42d5-b348-5bfdbb2b4dc0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1182.859044] env[67893]: DEBUG oslo_service.periodic_task [None
req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1183.865401] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1183.865708] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1183.865708] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1183.887224] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1183.887379] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1183.887508] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1183.887634] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1183.887756] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1183.887879] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1183.887999] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1183.888130] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Skipping network cache update for instance because it is Building. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1183.888254] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1183.888360] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1183.888480] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1184.450232] env[67893]: DEBUG oslo_concurrency.lockutils [None req-e8f51f6a-95ae-4c4e-89dd-e71dcdb2c5a7 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "5ede1991-efee-4c34-af5b-ce71f67456ef" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1184.858264] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1184.878900] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1184.878900] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping...
{{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1185.859613] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1185.859613] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1186.858949] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1186.858949] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1186.874143] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1186.874143] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1186.874143] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1186.874143] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1186.874757] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0bd94cd-f496-451c-b62a-794fddf3537f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1186.883765] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cd2d6c7-d313-4941-87bc-b55daf497dbe {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1186.898955] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c85c75d2-8c29-4f50-8f11-3499ed3d1711 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1186.905429] env[67893]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb76d033-40e2-44e3-a31b-5bba7e5a8c3c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1186.935033] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180991MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1186.936526] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1186.936921] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1187.092146] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2eb8d698-9436-4e91-bd10-5f5200415144 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1187.092358] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1187.092497] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1187.092621] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a9656a7e-8a7b-489e-9990-097c1e93e535 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1187.092739] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance fcae7119-6233-4a52-9e52-1147f2b10ddc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1187.092856] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2553f3c0-0988-4e11-a138-7e5f71e71f48 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1187.092973] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c05df6c1-e4c9-4276-9981-e80e584d540c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1187.093101] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5a24adaf-bced-4488-9ccb-fc996b2ba154 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1187.093261] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance efdb0a7e-403d-4de5-8c09-72b9c8f9cd79 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1187.093386] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5ede1991-efee-4c34-af5b-ce71f67456ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1187.105451] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance b3d31ca3-9a7a-49d0-955f-1e12808bf11f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1187.115532] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 14f7c0cf-cbf0-4090-89a9-45fe4485cf31 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1187.124724] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance d41abc6b-6519-4994-aa17-6b6bd94c93d9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1187.134086] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 42938110-2d23-432a-bdb2-30750dac90b4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1187.143026] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 8dbbc2e6-9993-4bf0-b66b-6e685789221c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1187.152619] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 783b7968-c130-47f5-9ad3-459d0e7eb746 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1187.161844] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 1591ce78-4293-4d03-be3f-a2cb552f51f7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1187.171624] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 46d7643f-00ab-4953-9a4c-e07b96615f2a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1187.180926] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance dd596db2-a53c-4609-a1da-6db1ec79846e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1187.191801] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 039d691f-31fe-4020-90aa-82905198e13d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1187.200720] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance f7bcf0fe-9569-4b61-be9e-c29f4116cb11 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1187.210340] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 0363316a-cf39-4741-baa9-a040d7486df2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1187.220632] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 9dce8f0a-8fbe-43a5-af0b-ab9f76055bef has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1187.232151] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1187.232426] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1187.232559] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1187.249379] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Refreshing inventories for resource provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1187.265316] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Updating ProviderTree inventory for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1187.265615] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Updating inventory in ProviderTree for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1187.277625] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Refreshing aggregate associations for resource provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57, aggregates: None {{(pid=67893) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1187.296448] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Refreshing trait associations for resource provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO {{(pid=67893) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1187.631697] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d4fad07-288d-4f0f-9c66-2456698a3386 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1187.639007] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-2f9c3c6a-a373-4f6e-92d5-9b05e397fce6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1187.668248] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d11a6ca-8fc5-40f7-93b5-d6deb71dd0b5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1187.675215] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a20a7c0-3b56-4625-a0bb-3077e325c8f3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1187.689144] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1187.698608] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1187.711832] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1187.712017] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.775s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1188.711854] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1190.765795] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._sync_power_states {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1190.788878] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Getting list of instances from cluster (obj){ [ 1190.788878] env[67893]: value = "domain-c8" [ 1190.788878] env[67893]: _type = "ClusterComputeResource" [ 1190.788878] env[67893]: } {{(pid=67893) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1190.790165] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3697d6f3-4d4f-4aa1-8cb0-6e63142a6fe6 {{(pid=67893) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1190.807308] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Got total of 10 instances {{(pid=67893) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1190.807483] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Triggering sync for uuid 2eb8d698-9436-4e91-bd10-5f5200415144 {{(pid=67893) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1190.807692] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Triggering sync for uuid 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff {{(pid=67893) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1190.807824] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Triggering sync for uuid c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e {{(pid=67893) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1190.807980] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Triggering sync for uuid a9656a7e-8a7b-489e-9990-097c1e93e535 {{(pid=67893) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1190.808149] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Triggering sync for uuid fcae7119-6233-4a52-9e52-1147f2b10ddc {{(pid=67893) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1190.808302] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Triggering sync for uuid 2553f3c0-0988-4e11-a138-7e5f71e71f48 {{(pid=67893) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1190.808467] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Triggering sync for uuid c05df6c1-e4c9-4276-9981-e80e584d540c {{(pid=67893) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1190.808617] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Triggering sync for uuid 5a24adaf-bced-4488-9ccb-fc996b2ba154 {{(pid=67893) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1190.808761] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Triggering sync for uuid efdb0a7e-403d-4de5-8c09-72b9c8f9cd79 {{(pid=67893) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1190.808903] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Triggering sync for uuid 5ede1991-efee-4c34-af5b-ce71f67456ef {{(pid=67893) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1190.809221] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "2eb8d698-9436-4e91-bd10-5f5200415144" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1190.809462] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67893) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1190.809684] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1190.809883] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "a9656a7e-8a7b-489e-9990-097c1e93e535" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1190.810107] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "fcae7119-6233-4a52-9e52-1147f2b10ddc" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1190.810309] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "2553f3c0-0988-4e11-a138-7e5f71e71f48" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1190.810504] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "c05df6c1-e4c9-4276-9981-e80e584d540c" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1190.810696] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "5a24adaf-bced-4488-9ccb-fc996b2ba154" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1190.810884] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "efdb0a7e-403d-4de5-8c09-72b9c8f9cd79" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1190.811086] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "5ede1991-efee-4c34-af5b-ce71f67456ef" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1222.387553] env[67893]: WARNING oslo_vmware.rw_handles [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1222.387553] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [
1222.387553] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1222.387553] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1222.387553] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1222.387553] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 1222.387553] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1222.387553] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1222.387553] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1222.387553] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1222.387553] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1222.387553] env[67893]: ERROR oslo_vmware.rw_handles [ 1222.388210] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/8218b43b-57af-45c6-ab38-5dae24625ca7/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1222.389887] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1222.390148] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Copying Virtual Disk [datastore1] vmware_temp/8218b43b-57af-45c6-ab38-5dae24625ca7/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/8218b43b-57af-45c6-ab38-5dae24625ca7/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1222.390444] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-8a0ddf2f-ded7-4c9f-954a-a0643ed99f13 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1222.398718] env[67893]: DEBUG oslo_vmware.api [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Waiting for the task: (returnval){ [ 1222.398718] env[67893]: value = "task-3455398" [ 1222.398718] env[67893]: _type = "Task" [ 1222.398718] env[67893]: } to complete. 
{{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1222.406965] env[67893]: DEBUG oslo_vmware.api [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Task: {'id': task-3455398, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1222.909349] env[67893]: DEBUG oslo_vmware.exceptions [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1222.909642] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1222.910190] env[67893]: ERROR nova.compute.manager [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1222.910190] env[67893]: Faults: ['InvalidArgument'] [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Traceback (most recent call last): [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] yield resources [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] self.driver.spawn(context, instance, image_meta, [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] self._fetch_image_if_missing(context, vi) [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] image_cache(vi, tmp_image_ds_loc) [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 
2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] vm_util.copy_virtual_disk( [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] session._wait_for_task(vmdk_copy_task) [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] return self.wait_for_task(task_ref) [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] return evt.wait() [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] result = hub.switch() [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] return self.greenlet.switch() [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] self.f(*self.args, **self.kw) [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] raise exceptions.translate_fault(task_info.error) [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Faults: ['InvalidArgument'] [ 1222.910190] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] [ 1222.911152] env[67893]: INFO nova.compute.manager [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Terminating instance [ 1222.912048] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1222.912270] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1222.912507] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9182236c-2fe7-4336-8265-85548388571d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1222.915638] env[67893]: DEBUG nova.compute.manager [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1222.915832] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1222.916568] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0b45d4a-d59f-40b0-b44c-48152d515c9c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1222.923927] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1222.924167] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-894be4a8-6dbe-4353-b8fc-01d4d71105c4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1222.926440] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1222.926612] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1222.927587] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-32fd9601-6a9c-4ee6-bbf0-b44935f18781 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1222.932170] env[67893]: DEBUG oslo_vmware.api [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Waiting for the task: (returnval){ [ 1222.932170] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]5297be58-681d-51b1-5a1b-a0a908f357c6" [ 1222.932170] env[67893]: _type = "Task" [ 1222.932170] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1222.946899] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1222.947677] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Creating directory with path [datastore1] vmware_temp/33a01613-45e7-445d-9f74-7faced02a204/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1222.947677] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-42ff1028-8f7a-4a8d-a01d-442dc09c93d0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1222.968020] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Created directory with path [datastore1] vmware_temp/33a01613-45e7-445d-9f74-7faced02a204/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1222.968235] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Fetch image to [datastore1] vmware_temp/33a01613-45e7-445d-9f74-7faced02a204/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1222.968408] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/33a01613-45e7-445d-9f74-7faced02a204/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1222.969181] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2c2ddc6-36c6-4b89-82a5-e4bc6fd1a420 {{(pid=67893) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1222.976067] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5de4c227-f63b-4178-aa62-7f16a78dbdf4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1222.984757] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-823c687b-febf-42e6-99b1-b3150efc5252 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1223.017124] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12ca3323-dcfa-4bf0-a892-a953d34e265f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1223.019568] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1223.019763] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1223.019934] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Deleting the datastore file [datastore1] 2eb8d698-9436-4e91-bd10-5f5200415144 {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1223.020177] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b083a7f8-bc86-400a-a6db-f9321e390858 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1223.026791] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7838d82b-c4ff-4410-b5ff-fbabf8d9b101 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1223.028422] env[67893]: DEBUG oslo_vmware.api [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Waiting for the task: (returnval){ [ 1223.028422] env[67893]: value = "task-3455400" [ 1223.028422] env[67893]: _type = "Task" [ 1223.028422] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1223.035426] env[67893]: DEBUG oslo_vmware.api [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Task: {'id': task-3455400, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1223.048496] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1223.098530] env[67893]: DEBUG oslo_vmware.rw_handles [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/33a01613-45e7-445d-9f74-7faced02a204/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1223.157485] env[67893]: DEBUG oslo_vmware.rw_handles [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1223.157688] env[67893]: DEBUG oslo_vmware.rw_handles [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/33a01613-45e7-445d-9f74-7faced02a204/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1223.539182] env[67893]: DEBUG oslo_vmware.api [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Task: {'id': task-3455400, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068687} completed successfully. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1223.539484] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1223.539725] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1223.539829] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1223.539980] env[67893]: INFO nova.compute.manager [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Took 0.62 seconds to destroy the instance on the hypervisor. [ 1223.542259] env[67893]: DEBUG nova.compute.claims [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1223.542451] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1223.542678] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1223.858815] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8242b1bf-8503-42fe-9fc3-24fa6f482056 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1223.866076] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7f71070-86bf-4492-8e1f-f4cc223a8029 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1223.895030] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8727e183-f8cf-439f-a4fb-e06a8b91e404 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1223.902065] env[67893]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88a1839d-e625-4c23-a23e-3afae16ab81f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1223.914780] env[67893]: DEBUG nova.compute.provider_tree [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1223.924382] env[67893]: DEBUG nova.scheduler.client.report [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1223.938284] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.395s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1223.938817] env[67893]: ERROR nova.compute.manager [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1223.938817] env[67893]: Faults: ['InvalidArgument'] [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Traceback (most recent call last): [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] self.driver.spawn(context, instance, image_meta, [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] self._fetch_image_if_missing(context, vi) [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 
1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] image_cache(vi, tmp_image_ds_loc) [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] vm_util.copy_virtual_disk( [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] session._wait_for_task(vmdk_copy_task) [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] return self.wait_for_task(task_ref) [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] return evt.wait() [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] result = hub.switch() [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] return self.greenlet.switch() [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] self.f(*self.args, **self.kw) [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] raise exceptions.translate_fault(task_info.error) [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Faults: ['InvalidArgument'] [ 1223.938817] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] [ 1223.939730] env[67893]: DEBUG nova.compute.utils [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] VimFaultException {{(pid=67893) notify_about_instance_usage 
/opt/stack/nova/nova/compute/utils.py:430}} [ 1223.940935] env[67893]: DEBUG nova.compute.manager [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Build of instance 2eb8d698-9436-4e91-bd10-5f5200415144 was re-scheduled: A specified parameter was not correct: fileType [ 1223.940935] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1223.941326] env[67893]: DEBUG nova.compute.manager [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1223.941491] env[67893]: DEBUG nova.compute.manager [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1223.941658] env[67893]: DEBUG nova.compute.manager [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1223.941855] env[67893]: DEBUG nova.network.neutron [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1224.072417] env[67893]: DEBUG neutronclient.v2_0.client [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67893) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1224.074600] env[67893]: ERROR nova.compute.manager [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
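The records above show Nova's re-schedule path: the spawn failed in the VMware driver with VimFaultException ("A specified parameter was not correct: fileType"), _do_build_and_run_instance converted that into a RescheduledException, and the local cleanup then had to call Neutron to release ports, which is where the 401 surfaced; the traceback that follows spells out that chain. A minimal sketch of the control flow, with placeholder helpers standing in for the real Nova internals:

    class RescheduledException(Exception):
        """Stand-in for nova.exception.RescheduledException."""

    def build_and_run_instance(instance):
        # Placeholder for the spawn path that raised the VimFaultException
        # ("InvalidArgument: fileType") earlier in this log.
        raise RescheduledException('spawn failed on this host')

    def cleanup_allocated_networks(instance):
        # Placeholder for the Neutron port cleanup; in this log it is the
        # step that itself fails with the 401 Unauthorized.
        pass

    def do_build_and_run_instance(instance):
        try:
            build_and_run_instance(instance)
        except RescheduledException:
            # The build is abandoned on this host; network resources are
            # released before the request goes back to the scheduler for
            # a retry on another host.
            cleanup_allocated_networks(instance)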
[ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Traceback (most recent call last): [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] self.driver.spawn(context, instance, image_meta, [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] self._fetch_image_if_missing(context, vi) [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] image_cache(vi, tmp_image_ds_loc) [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] vm_util.copy_virtual_disk( [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] session._wait_for_task(vmdk_copy_task) [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] return self.wait_for_task(task_ref) [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] return evt.wait() [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] result = hub.switch() [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] return self.greenlet.switch() [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] self.f(*self.args, **self.kw) [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] raise exceptions.translate_fault(task_info.error) [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Faults: ['InvalidArgument'] [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] During handling of the above exception, another exception occurred: [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Traceback (most recent call last): [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/compute/manager.py", line 2430, in _do_build_and_run_instance [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] self._build_and_run_instance(context, instance, image, [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/compute/manager.py", line 2722, in _build_and_run_instance [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] raise exception.RescheduledException( [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] nova.exception.RescheduledException: Build of instance 2eb8d698-9436-4e91-bd10-5f5200415144 was re-scheduled: A specified parameter was not correct: fileType [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Faults: ['InvalidArgument'] [ 1224.074600] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] During handling of the above exception, another exception occurred: [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Traceback (most recent call last): [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] ret = obj(*args, **kwargs) [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 
1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] exception_handler_v20(status_code, error_body) [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] raise client_exc(message=error_message, [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Neutron server returns request_ids: ['req-1daa1e84-9e74-46f5-9151-75f0ea2ef99e'] [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] During handling of the above exception, another exception occurred: [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Traceback (most recent call last): [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/compute/manager.py", line 3019, in _cleanup_allocated_networks [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] self._deallocate_network(context, instance, requested_networks) [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] self.network_api.deallocate_for_instance( [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] data = neutron.list_ports(**search_opts) [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] ret = obj(*args, **kwargs) [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] return self.list('ports', self.ports_path, retrieve_all, [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] ret = obj(*args, **kwargs) [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 
2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] for r in self._pagination(collection, path, **params): [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] res = self.get(path, params=params) [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] ret = obj(*args, **kwargs) [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] return self.retry_request("GET", action, body=body, [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] ret = obj(*args, **kwargs) [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1224.075781] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] return self.do_request(method, action, body=body, [ 1224.076959] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1224.076959] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] ret = obj(*args, **kwargs) [ 1224.076959] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1224.076959] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] self._handle_fault_response(status_code, replybody, resp) [ 1224.076959] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1224.076959] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] raise exception.Unauthorized() [ 1224.076959] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] nova.exception.Unauthorized: Not authorized. 
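The chain ends in nova.exception.Unauthorized because Nova never exposes raw neutronclient errors: every client call in nova/network/neutron.py passes through the wrapper visible in the repeated "wrapper ... ret = obj(*args, **kwargs)" frames, which translates neutronclient's Unauthorized into a Nova exception. A simplified sketch of that translation (the stand-in exception classes approximate the ones in nova.exception; the real wrapper also inspects the request context):

    import functools

    from neutronclient.common import exceptions as neutron_client_exc

    class Unauthorized(Exception):
        """Stand-in for nova.exception.Unauthorized ("Not authorized.")."""

    class NeutronAdminCredentialConfigurationInvalid(Exception):
        """Stand-in for the exception used when admin credentials fail."""

    def translate_neutron_exception(func, admin=False):
        """Convert a neutronclient 401 into the matching Nova exception."""
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except neutron_client_exc.Unauthorized:
                if admin:
                    # A 401 on the admin client points at bad [neutron]
                    # credentials in nova.conf, not at user permissions.
                    raise NeutronAdminCredentialConfigurationInvalid()
                raise Unauthorized()
        return wrapper

This distinction is why the user-context cleanup above raised nova.exception.Unauthorized, while the admin-context deallocation further down raises NeutronAdminCredentialConfigurationInvalid for the same underlying 401.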
[ 1224.076959] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] [ 1224.129300] env[67893]: INFO nova.scheduler.client.report [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Deleted allocations for instance 2eb8d698-9436-4e91-bd10-5f5200415144 [ 1224.149431] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3672c443-f6a5-410b-ac26-93a12d0c9242 tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Lock "2eb8d698-9436-4e91-bd10-5f5200415144" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 566.911s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1224.149641] env[67893]: DEBUG oslo_concurrency.lockutils [None req-186ba630-f253-4bd4-9a5b-92c8fe71a02f tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Lock "2eb8d698-9436-4e91-bd10-5f5200415144" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 368.417s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1224.149707] env[67893]: DEBUG oslo_concurrency.lockutils [None req-186ba630-f253-4bd4-9a5b-92c8fe71a02f tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Acquiring lock "2eb8d698-9436-4e91-bd10-5f5200415144-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1224.150299] env[67893]: DEBUG oslo_concurrency.lockutils [None req-186ba630-f253-4bd4-9a5b-92c8fe71a02f tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Lock "2eb8d698-9436-4e91-bd10-5f5200415144-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1224.150299] env[67893]: DEBUG oslo_concurrency.lockutils [None req-186ba630-f253-4bd4-9a5b-92c8fe71a02f tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Lock "2eb8d698-9436-4e91-bd10-5f5200415144-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1224.151921] env[67893]: INFO nova.compute.manager [None req-186ba630-f253-4bd4-9a5b-92c8fe71a02f tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Terminating instance [ 1224.153746] env[67893]: DEBUG nova.compute.manager [None req-186ba630-f253-4bd4-9a5b-92c8fe71a02f tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Start destroying the instance on the hypervisor. 
{{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1224.154027] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-186ba630-f253-4bd4-9a5b-92c8fe71a02f tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1224.154424] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a157b414-6cfb-48dd-9989-59b37620880b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1224.164011] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-faaf60b6-08bb-4d90-94ed-16b86f5104e8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1224.174967] env[67893]: DEBUG nova.compute.manager [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1224.198290] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-186ba630-f253-4bd4-9a5b-92c8fe71a02f tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 2eb8d698-9436-4e91-bd10-5f5200415144 could not be found. [ 1224.198290] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-186ba630-f253-4bd4-9a5b-92c8fe71a02f tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1224.198290] env[67893]: INFO nova.compute.manager [None req-186ba630-f253-4bd4-9a5b-92c8fe71a02f tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1224.198290] env[67893]: DEBUG oslo.service.loopingcall [None req-186ba630-f253-4bd4-9a5b-92c8fe71a02f tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1224.198290] env[67893]: DEBUG nova.compute.manager [-] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1224.198290] env[67893]: DEBUG nova.network.neutron [-] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1224.220931] env[67893]: DEBUG oslo_concurrency.lockutils [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1224.221215] env[67893]: DEBUG oslo_concurrency.lockutils [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1224.222820] env[67893]: INFO nova.compute.claims [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1224.296645] env[67893]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67893) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1224.296645] env[67893]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
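The "Dynamic interval looping call ... failed" record is oslo.service's retry wrapper giving up on the first attempt: _try_deallocate_network retries the deallocation only for a configured set of retryable exceptions, NeutronAdminCredentialConfigurationInvalid is not among them, and so it escapes immediately; its traceback follows. A rough pure-Python approximation of the pattern (illustrative names, not the oslo.service API):

    import time

    def retry_on(exceptions, max_attempts=3, interval=1.0):
        """Retry a callable, but only for the listed exception types.

        Exception types that are not listed propagate on the first
        attempt, which is how the 401 here became an immediate failure.
        """
        def decorator(func):
            def _func(*args, **kwargs):
                for attempt in range(1, max_attempts + 1):
                    try:
                        return func(*args, **kwargs)
                    except exceptions:
                        if attempt == max_attempts:
                            raise
                        time.sleep(interval)
            return _func
        return decorator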
[ 1224.296720] env[67893]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-5cb6ac73-b663-4d5d-8f62-eb2c593a2679'] [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1224.296720] env[67893]: ERROR 
oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1224.296720] env[67893]: ERROR oslo.service.loopingcall [ 1224.297968] env[67893]: ERROR nova.compute.manager [None req-186ba630-f253-4bd4-9a5b-92c8fe71a02f tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1224.329151] env[67893]: ERROR nova.compute.manager [None req-186ba630-f253-4bd4-9a5b-92c8fe71a02f tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
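The excutils.py frames in the traceback that follows (__exit__, force_reraise, raise self.value) are oslo.utils' save_and_reraise_exception context manager: the terminate path runs its failure handling, here flipping the instance to ERROR and reverting the task state, inside the context manager so the original exception is preserved and re-raised when the block exits. Usage, with placeholder helpers for the Nova-side steps:

    from oslo_utils import excutils

    def deallocate_network(instance):
        # Placeholder for the call that fails with the Neutron 401.
        raise RuntimeError('401 Unauthorized')

    def mark_instance_error(instance):
        # Placeholder for setting vm_state to ERROR and saving.
        pass

    def shutdown_instance(instance):
        try:
            deallocate_network(instance)
        except Exception:
            with excutils.save_and_reraise_exception():
                # Runs while the original exception is held; on exit,
                # force_reraise() re-raises that same exception, so this
                # cleanup cannot accidentally swallow the failure.
                mark_instance_error(instance)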
[ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Traceback (most recent call last): [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] ret = obj(*args, **kwargs) [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] exception_handler_v20(status_code, error_body) [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] raise client_exc(message=error_message, [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Neutron server returns request_ids: ['req-5cb6ac73-b663-4d5d-8f62-eb2c593a2679'] [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] During handling of the above exception, another exception occurred: [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Traceback (most recent call last): [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] self._delete_instance(context, instance, bdms) [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] self._shutdown_instance(context, instance, bdms) [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] self._try_deallocate_network(context, instance, requested_networks) [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] with excutils.save_and_reraise_exception(): [ 1224.329151] env[67893]: ERROR 
nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] self.force_reraise() [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] raise self.value [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] _deallocate_network_with_retries() [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] return evt.wait() [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] result = hub.switch() [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] return self.greenlet.switch() [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] result = func(*self.args, **self.kw) [ 1224.329151] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] result = f(*args, **kwargs) [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] self._deallocate_network( [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] self.network_api.deallocate_for_instance( [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 
2eb8d698-9436-4e91-bd10-5f5200415144] data = neutron.list_ports(**search_opts) [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] ret = obj(*args, **kwargs) [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] return self.list('ports', self.ports_path, retrieve_all, [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] ret = obj(*args, **kwargs) [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] for r in self._pagination(collection, path, **params): [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] res = self.get(path, params=params) [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] ret = obj(*args, **kwargs) [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] return self.retry_request("GET", action, body=body, [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] ret = obj(*args, **kwargs) [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] return self.do_request(method, action, body=body, [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] ret = obj(*args, **kwargs) [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] self._handle_fault_response(status_code, replybody, resp) [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1224.330374] env[67893]: ERROR nova.compute.manager [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] [ 1224.356382] env[67893]: DEBUG oslo_concurrency.lockutils [None req-186ba630-f253-4bd4-9a5b-92c8fe71a02f tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Lock "2eb8d698-9436-4e91-bd10-5f5200415144" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.207s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1224.357487] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "2eb8d698-9436-4e91-bd10-5f5200415144" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 33.548s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1224.357680] env[67893]: INFO nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] During sync_power_state the instance has a pending task (deleting). Skip. [ 1224.357852] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "2eb8d698-9436-4e91-bd10-5f5200415144" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1224.408653] env[67893]: INFO nova.compute.manager [None req-186ba630-f253-4bd4-9a5b-92c8fe71a02f tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] [instance: 2eb8d698-9436-4e91-bd10-5f5200415144] Successfully reverted task state from None on failure for instance. [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server [None req-186ba630-f253-4bd4-9a5b-92c8fe71a02f tempest-ListImageFiltersTestJSON-588175274 tempest-ListImageFiltersTestJSON-588175274-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-5cb6ac73-b663-4d5d-8f62-eb2c593a2679'] [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server raise self.value [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server raise self.value [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server raise self.value [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 1224.413014] env[67893]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server raise self.value [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server raise self.value [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1224.414647] env[67893]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1224.414647] env[67893]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1224.416176] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1224.416176] env[67893]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1224.416176] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1224.416176] env[67893]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1224.416176] env[67893]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
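Reading the middle of this traceback bottom-up: _try_deallocate_network runs _deallocate_network_with_retries() as an oslo.service looping call on an eventlet greenthread (the loopingcall.py and hub.switch() frames), and each save_and_reraise_exception context manager on the way out re-raises the original failure, so once the retries are exhausted the 401 surfaces as the delete's final error instead of ports being silently leaked. A stdlib-only sketch of that retry-then-reraise shape; the names deallocate, attempts, and delay are made up for illustration and this is not nova's actual interface:

import time

def deallocate_with_retries(deallocate, attempts=3, delay=1.0):
    # Illustrative stand-in for nova's _deallocate_network_with_retries():
    # try the network deallocation a bounded number of times, then let the
    # last exception escape to the caller (as the 401 does in this log).
    for attempt in range(1, attempts + 1):
        try:
            return deallocate()
        except Exception:
            if attempt == attempts:
                raise
            time.sleep(delay)

if __name__ == "__main__":
    def deallocate():
        raise RuntimeError("401: The request you have made requires authentication.")

    try:
        deallocate_with_retries(deallocate, attempts=3, delay=0.0)
    except RuntimeError as exc:
        print("giving up:", exc)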
[ 1224.416176] env[67893]: ERROR oslo_messaging.rpc.server [ 1224.528438] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffe63c5a-8ba2-4280-aa4d-5466722d598a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1224.536436] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a235447d-edb3-4e82-9978-8650f4773a29 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1224.571952] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7241000e-f593-4876-8ac0-d6b2600e08a6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1224.579132] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2132c40f-138e-4c9c-b9b3-61ee06ece1b5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1224.592304] env[67893]: DEBUG nova.compute.provider_tree [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1224.601296] env[67893]: DEBUG nova.scheduler.client.report [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1224.615459] env[67893]: DEBUG oslo_concurrency.lockutils [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.394s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1224.615935] env[67893]: DEBUG nova.compute.manager [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Start building networks asynchronously for instance. 
{{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1224.648195] env[67893]: DEBUG nova.compute.utils [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1224.649760] env[67893]: DEBUG nova.compute.manager [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1224.650124] env[67893]: DEBUG nova.network.neutron [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1224.658931] env[67893]: DEBUG nova.compute.manager [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1224.723217] env[67893]: DEBUG nova.policy [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '894285baafaf410ea301f676b78c45f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b439a6039a714a6fabd3c0477629d3c1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 1224.729981] env[67893]: DEBUG nova.compute.manager [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1224.755619] env[67893]: DEBUG nova.virt.hardware [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1224.755861] env[67893]: DEBUG nova.virt.hardware [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1224.756029] env[67893]: DEBUG nova.virt.hardware [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1224.756222] env[67893]: DEBUG nova.virt.hardware [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1224.756368] env[67893]: DEBUG nova.virt.hardware [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1224.756517] env[67893]: DEBUG nova.virt.hardware [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1224.756727] env[67893]: DEBUG nova.virt.hardware [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1224.756888] env[67893]: DEBUG nova.virt.hardware [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1224.757067] env[67893]: DEBUG nova.virt.hardware [None 
req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1224.757234] env[67893]: DEBUG nova.virt.hardware [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1224.757405] env[67893]: DEBUG nova.virt.hardware [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1224.758271] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97bfea78-778e-42c7-bb5d-bb69b3e9cef4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1224.767331] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf1bdd05-cbca-4e5d-b4b5-a7789613f166 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1225.118379] env[67893]: DEBUG nova.network.neutron [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Successfully created port: d1a17be3-b7ee-4c35-adc6-aa03253d1ca7 {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1225.681540] env[67893]: DEBUG nova.compute.manager [req-e037a5ef-5a0f-45a7-8eb3-df165509807c req-4f0d9a29-1193-4420-9bd5-93eaa79047ca service nova] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Received event network-vif-plugged-d1a17be3-b7ee-4c35-adc6-aa03253d1ca7 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1225.681832] env[67893]: DEBUG oslo_concurrency.lockutils [req-e037a5ef-5a0f-45a7-8eb3-df165509807c req-4f0d9a29-1193-4420-9bd5-93eaa79047ca service nova] Acquiring lock "b3d31ca3-9a7a-49d0-955f-1e12808bf11f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1225.682052] env[67893]: DEBUG oslo_concurrency.lockutils [req-e037a5ef-5a0f-45a7-8eb3-df165509807c req-4f0d9a29-1193-4420-9bd5-93eaa79047ca service nova] Lock "b3d31ca3-9a7a-49d0-955f-1e12808bf11f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1225.682243] env[67893]: DEBUG oslo_concurrency.lockutils [req-e037a5ef-5a0f-45a7-8eb3-df165509807c req-4f0d9a29-1193-4420-9bd5-93eaa79047ca service nova] Lock "b3d31ca3-9a7a-49d0-955f-1e12808bf11f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1225.682424] env[67893]: DEBUG nova.compute.manager 
[req-e037a5ef-5a0f-45a7-8eb3-df165509807c req-4f0d9a29-1193-4420-9bd5-93eaa79047ca service nova] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] No waiting events found dispatching network-vif-plugged-d1a17be3-b7ee-4c35-adc6-aa03253d1ca7 {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1225.682586] env[67893]: WARNING nova.compute.manager [req-e037a5ef-5a0f-45a7-8eb3-df165509807c req-4f0d9a29-1193-4420-9bd5-93eaa79047ca service nova] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Received unexpected event network-vif-plugged-d1a17be3-b7ee-4c35-adc6-aa03253d1ca7 for instance with vm_state building and task_state spawning. [ 1225.846185] env[67893]: DEBUG nova.network.neutron [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Successfully updated port: d1a17be3-b7ee-4c35-adc6-aa03253d1ca7 {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1225.861298] env[67893]: DEBUG oslo_concurrency.lockutils [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "refresh_cache-b3d31ca3-9a7a-49d0-955f-1e12808bf11f" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1225.861718] env[67893]: DEBUG oslo_concurrency.lockutils [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquired lock "refresh_cache-b3d31ca3-9a7a-49d0-955f-1e12808bf11f" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1225.861718] env[67893]: DEBUG nova.network.neutron [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1225.913970] env[67893]: DEBUG nova.network.neutron [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Instance cache missing network info. 
{{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1226.292753] env[67893]: DEBUG nova.network.neutron [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Updating instance_info_cache with network_info: [{"id": "d1a17be3-b7ee-4c35-adc6-aa03253d1ca7", "address": "fa:16:3e:f7:7d:6c", "network": {"id": "3269c624-7a70-494c-85bc-8230ffbbab83", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-740576182-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b439a6039a714a6fabd3c0477629d3c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd1a17be3-b7", "ovs_interfaceid": "d1a17be3-b7ee-4c35-adc6-aa03253d1ca7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1226.304946] env[67893]: DEBUG oslo_concurrency.lockutils [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Releasing lock "refresh_cache-b3d31ca3-9a7a-49d0-955f-1e12808bf11f" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1226.305259] env[67893]: DEBUG nova.compute.manager [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Instance network_info: |[{"id": "d1a17be3-b7ee-4c35-adc6-aa03253d1ca7", "address": "fa:16:3e:f7:7d:6c", "network": {"id": "3269c624-7a70-494c-85bc-8230ffbbab83", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-740576182-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b439a6039a714a6fabd3c0477629d3c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd1a17be3-b7", "ovs_interfaceid": "d1a17be3-b7ee-4c35-adc6-aa03253d1ca7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1226.305720] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f7:7d:6c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'fa01fe1a-83b6-4c10-af75-00ddb17f9bbf', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd1a17be3-b7ee-4c35-adc6-aa03253d1ca7', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1226.313138] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Creating folder: Project (b439a6039a714a6fabd3c0477629d3c1). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1226.313648] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b74f3f3b-e06b-4413-b4d1-4a64683a21f4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1226.327092] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Created folder: Project (b439a6039a714a6fabd3c0477629d3c1) in parent group-v689771. [ 1226.327092] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Creating folder: Instances. Parent ref: group-v689840. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1226.327092] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-02c9dfa0-54bc-45d2-8c69-ee3a7fd8ab72 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1226.334355] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Created folder: Instances in parent group-v689840. [ 1226.334764] env[67893]: DEBUG oslo.service.loopingcall [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1226.335130] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1226.335463] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1bbcf530-e708-407e-bad8-4442aaea227a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1226.355417] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1226.355417] env[67893]: value = "task-3455403" [ 1226.355417] env[67893]: _type = "Task" [ 1226.355417] env[67893]: } to complete. 
{{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1226.363176] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455403, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1226.865387] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455403, 'name': CreateVM_Task, 'duration_secs': 0.295586} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1226.865691] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1226.866222] env[67893]: DEBUG oslo_concurrency.lockutils [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1226.866384] env[67893]: DEBUG oslo_concurrency.lockutils [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1226.866692] env[67893]: DEBUG oslo_concurrency.lockutils [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1226.866934] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f184514e-a4f2-497a-b40b-d28c2a4b5896 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1226.871395] env[67893]: DEBUG oslo_vmware.api [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Waiting for the task: (returnval){ [ 1226.871395] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52a12286-57bc-345a-22c5-0bb012c9a221" [ 1226.871395] env[67893]: _type = "Task" [ 1226.871395] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1226.878801] env[67893]: DEBUG oslo_vmware.api [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52a12286-57bc-345a-22c5-0bb012c9a221, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1227.381195] env[67893]: DEBUG oslo_concurrency.lockutils [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1227.381499] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1227.381755] env[67893]: DEBUG oslo_concurrency.lockutils [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1227.708270] env[67893]: DEBUG nova.compute.manager [req-ff8319a4-b923-4825-9b01-61176b9cf67b req-ca1ca5ad-d155-4719-b748-1245c6c43381 service nova] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Received event network-changed-d1a17be3-b7ee-4c35-adc6-aa03253d1ca7 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1227.708358] env[67893]: DEBUG nova.compute.manager [req-ff8319a4-b923-4825-9b01-61176b9cf67b req-ca1ca5ad-d155-4719-b748-1245c6c43381 service nova] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Refreshing instance network info cache due to event network-changed-d1a17be3-b7ee-4c35-adc6-aa03253d1ca7. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1227.708515] env[67893]: DEBUG oslo_concurrency.lockutils [req-ff8319a4-b923-4825-9b01-61176b9cf67b req-ca1ca5ad-d155-4719-b748-1245c6c43381 service nova] Acquiring lock "refresh_cache-b3d31ca3-9a7a-49d0-955f-1e12808bf11f" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1227.708662] env[67893]: DEBUG oslo_concurrency.lockutils [req-ff8319a4-b923-4825-9b01-61176b9cf67b req-ca1ca5ad-d155-4719-b748-1245c6c43381 service nova] Acquired lock "refresh_cache-b3d31ca3-9a7a-49d0-955f-1e12808bf11f" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1227.708821] env[67893]: DEBUG nova.network.neutron [req-ff8319a4-b923-4825-9b01-61176b9cf67b req-ca1ca5ad-d155-4719-b748-1245c6c43381 service nova] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Refreshing network info cache for port d1a17be3-b7ee-4c35-adc6-aa03253d1ca7 {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1227.994814] env[67893]: DEBUG nova.network.neutron [req-ff8319a4-b923-4825-9b01-61176b9cf67b req-ca1ca5ad-d155-4719-b748-1245c6c43381 service nova] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Updated VIF entry in instance network info cache for port d1a17be3-b7ee-4c35-adc6-aa03253d1ca7. 
{{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1227.995195] env[67893]: DEBUG nova.network.neutron [req-ff8319a4-b923-4825-9b01-61176b9cf67b req-ca1ca5ad-d155-4719-b748-1245c6c43381 service nova] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Updating instance_info_cache with network_info: [{"id": "d1a17be3-b7ee-4c35-adc6-aa03253d1ca7", "address": "fa:16:3e:f7:7d:6c", "network": {"id": "3269c624-7a70-494c-85bc-8230ffbbab83", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-740576182-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b439a6039a714a6fabd3c0477629d3c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd1a17be3-b7", "ovs_interfaceid": "d1a17be3-b7ee-4c35-adc6-aa03253d1ca7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1228.004320] env[67893]: DEBUG oslo_concurrency.lockutils [req-ff8319a4-b923-4825-9b01-61176b9cf67b req-ca1ca5ad-d155-4719-b748-1245c6c43381 service nova] Releasing lock "refresh_cache-b3d31ca3-9a7a-49d0-955f-1e12808bf11f" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1236.888140] env[67893]: DEBUG oslo_concurrency.lockutils [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Acquiring lock "021f1a86-6015-4a22-b501-3ec9079edbec" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1236.888480] env[67893]: DEBUG oslo_concurrency.lockutils [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Lock "021f1a86-6015-4a22-b501-3ec9079edbec" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1240.451896] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b35da648-c1db-48c5-8c47-853072dc757e tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "b3d31ca3-9a7a-49d0-955f-1e12808bf11f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1240.904599] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) 
run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1243.859615] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1243.859903] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1243.859950] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1243.882546] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1243.882710] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1243.882842] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1243.882968] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1243.883108] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1243.883229] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1243.883362] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1243.883457] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Skipping network cache update for instance because it is Building. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1243.883577] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1243.883731] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1243.883855] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1243.884371] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1245.858478] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1245.858836] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1245.858887] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1245.859033] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1247.858620] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1247.858931] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1247.859023] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1247.870322] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1247.872198] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1247.872198] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1247.872198] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1247.872976] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a741f260-6e54-42cf-8a29-bf0b59b70380 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1247.883104] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9a59085-7633-4c51-8feb-4fe538b493df {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1247.899344] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00bdedd8-98f5-4caf-ab0a-9e24e63847ac {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1247.905968] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd2353b3-7857-422f-a56d-43124fe2669e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1247.937886] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] 
Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180973MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1247.938065] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1247.938267] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1248.040599] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1248.040867] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1248.041067] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a9656a7e-8a7b-489e-9990-097c1e93e535 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1248.041252] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance fcae7119-6233-4a52-9e52-1147f2b10ddc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1248.041430] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2553f3c0-0988-4e11-a138-7e5f71e71f48 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1248.041605] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c05df6c1-e4c9-4276-9981-e80e584d540c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1248.044330] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5a24adaf-bced-4488-9ccb-fc996b2ba154 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1248.044330] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance efdb0a7e-403d-4de5-8c09-72b9c8f9cd79 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1248.044330] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5ede1991-efee-4c34-af5b-ce71f67456ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1248.044330] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance b3d31ca3-9a7a-49d0-955f-1e12808bf11f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1248.056054] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance d41abc6b-6519-4994-aa17-6b6bd94c93d9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1248.067880] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 42938110-2d23-432a-bdb2-30750dac90b4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1248.078242] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 8dbbc2e6-9993-4bf0-b66b-6e685789221c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1248.087974] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 783b7968-c130-47f5-9ad3-459d0e7eb746 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1248.097641] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 1591ce78-4293-4d03-be3f-a2cb552f51f7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1248.107705] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 46d7643f-00ab-4953-9a4c-e07b96615f2a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1248.120228] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance dd596db2-a53c-4609-a1da-6db1ec79846e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1248.131109] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 039d691f-31fe-4020-90aa-82905198e13d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1248.141526] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance f7bcf0fe-9569-4b61-be9e-c29f4116cb11 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1248.151022] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 0363316a-cf39-4741-baa9-a040d7486df2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1248.160433] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 9dce8f0a-8fbe-43a5-af0b-ab9f76055bef has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1248.170214] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1248.182339] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 021f1a86-6015-4a22-b501-3ec9079edbec has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1248.182577] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1248.182723] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1248.520015] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb6ed372-3664-4a13-ac7f-28876f1cfe83 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1248.528034] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30a4360d-c1f6-4530-b12f-1afb4fb48f73 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1248.563362] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1997ca9d-7127-416e-8a90-c48f4ff7a143 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1248.571953] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3a5ea5e-e5cc-4cf1-b43f-f24696abd37f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1248.589637] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1248.600247] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 
'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1248.619072] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1248.619299] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.681s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1252.906142] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Acquiring lock "25d67f98-c132-434b-9d22-4569585527eb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1252.906470] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Lock "25d67f98-c132-434b-9d22-4569585527eb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1256.285709] env[67893]: DEBUG oslo_concurrency.lockutils [None req-37daa011-268e-415c-9465-50b3a80b115e tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Acquiring lock "bb9f69b8-d92d-4895-8115-0c436fd51367" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1256.286042] env[67893]: DEBUG oslo_concurrency.lockutils [None req-37daa011-268e-415c-9465-50b3a80b115e tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Lock "bb9f69b8-d92d-4895-8115-0c436fd51367" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1269.785293] env[67893]: WARNING oslo_vmware.rw_handles [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1269.785293] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1269.785293] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1269.785293] env[67893]:
ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1269.785293] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1269.785293] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 1269.785293] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1269.785293] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1269.785293] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1269.785293] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1269.785293] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1269.785293] env[67893]: ERROR oslo_vmware.rw_handles [ 1269.786025] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/33a01613-45e7-445d-9f74-7faced02a204/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1269.787686] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1269.787928] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Copying Virtual Disk [datastore1] vmware_temp/33a01613-45e7-445d-9f74-7faced02a204/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/33a01613-45e7-445d-9f74-7faced02a204/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1269.788224] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f453c2e9-e629-45b4-8cd3-7e65623ab4f8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1269.796602] env[67893]: DEBUG oslo_vmware.api [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Waiting for the task: (returnval){ [ 1269.796602] env[67893]: value = "task-3455404" [ 1269.796602] env[67893]: _type = "Task" [ 1269.796602] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1269.804455] env[67893]: DEBUG oslo_vmware.api [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Task: {'id': task-3455404, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1270.307117] env[67893]: DEBUG oslo_vmware.exceptions [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1270.307431] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1270.308550] env[67893]: ERROR nova.compute.manager [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1270.308550] env[67893]: Faults: ['InvalidArgument'] [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Traceback (most recent call last): [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] yield resources [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] self.driver.spawn(context, instance, image_meta, [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] self._fetch_image_if_missing(context, vi) [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] image_cache(vi, tmp_image_ds_loc) [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] vm_util.copy_virtual_disk( [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] session._wait_for_task(vmdk_copy_task) [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] return self.wait_for_task(task_ref) [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] return evt.wait() [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] result = hub.switch() [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] return self.greenlet.switch() [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] self.f(*self.args, **self.kw) [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] raise exceptions.translate_fault(task_info.error) [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Faults: ['InvalidArgument'] [ 1270.308550] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] [ 1270.308550] env[67893]: INFO nova.compute.manager [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Terminating instance [ 1270.310336] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1270.310336] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 
tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1270.310336] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-07294dd4-e2e1-48b6-8768-d51f8b4896bf {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1270.312815] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Acquiring lock "refresh_cache-5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1270.312999] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Acquired lock "refresh_cache-5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1270.313194] env[67893]: DEBUG nova.network.neutron [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1270.318020] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1270.318239] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1270.318949] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cbca64c2-8cc5-4d4c-9d23-8449bd4853ad {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1270.325738] env[67893]: DEBUG oslo_vmware.api [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Waiting for the task: (returnval){ [ 1270.325738] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]527078f5-59eb-01c0-10eb-7d4ddc47753d" [ 1270.325738] env[67893]: _type = "Task" [ 1270.325738] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1270.333872] env[67893]: DEBUG oslo_vmware.api [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]527078f5-59eb-01c0-10eb-7d4ddc47753d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1270.425569] env[67893]: DEBUG nova.network.neutron [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1270.499698] env[67893]: DEBUG nova.network.neutron [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1270.509324] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Releasing lock "refresh_cache-5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1270.509803] env[67893]: DEBUG nova.compute.manager [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1270.510031] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1270.511117] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c712c41-0497-4a6d-a930-523791b4b1ec {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1270.519283] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1270.519509] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-02d13d69-f534-4973-8fa1-291ead8cb8b2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1270.551000] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1270.551239] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Deleting contents of the VM from datastore datastore1 {{(pid=67893) 
_destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1270.551417] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Deleting the datastore file [datastore1] 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1270.551664] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3b771ae1-f75a-43e5-8e0a-e7d840cb5862 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1270.558280] env[67893]: DEBUG oslo_vmware.api [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Waiting for the task: (returnval){ [ 1270.558280] env[67893]: value = "task-3455406" [ 1270.558280] env[67893]: _type = "Task" [ 1270.558280] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1270.567032] env[67893]: DEBUG oslo_vmware.api [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Task: {'id': task-3455406, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1270.837627] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1270.837944] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Creating directory with path [datastore1] vmware_temp/7e89694c-f3be-48fc-a3d4-aa0a4160cbcc/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1270.838182] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-274c6dfa-afe8-4266-b346-eee00f07b7bd {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1270.849269] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Created directory with path [datastore1] vmware_temp/7e89694c-f3be-48fc-a3d4-aa0a4160cbcc/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1270.849422] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Fetch image to [datastore1] vmware_temp/7e89694c-f3be-48fc-a3d4-aa0a4160cbcc/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1270.849678] env[67893]: DEBUG 
nova.virt.vmwareapi.vmops [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/7e89694c-f3be-48fc-a3d4-aa0a4160cbcc/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1270.850432] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d1c9d6e-b1a0-42ff-9171-dfd94238ec03 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1270.857228] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3477504-b12d-4bb0-8597-ff25de244f1f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1270.865904] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20276cdb-a5a2-4b5c-9c38-07cff141839a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1270.895335] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9781c5ca-9c7c-4a68-8f18-f06095a5c1f4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1270.900693] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ac482e1b-41e8-4b27-821b-f02553bd6e1b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1270.920783] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1271.067245] env[67893]: DEBUG oslo_vmware.api [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Task: {'id': task-3455406, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.041821} completed successfully. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1271.069195] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1271.069369] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1271.069543] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1271.069717] env[67893]: INFO nova.compute.manager [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Took 0.56 seconds to destroy the instance on the hypervisor. [ 1271.069956] env[67893]: DEBUG oslo.service.loopingcall [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1271.070353] env[67893]: DEBUG nova.compute.manager [-] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Skipping network deallocation for instance since networking was not requested.
{{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1271.072674] env[67893]: DEBUG nova.compute.claims [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1271.072862] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1271.073090] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1271.083435] env[67893]: DEBUG oslo_vmware.rw_handles [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7e89694c-f3be-48fc-a3d4-aa0a4160cbcc/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1271.146886] env[67893]: DEBUG oslo_vmware.rw_handles [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1271.147163] env[67893]: DEBUG oslo_vmware.rw_handles [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7e89694c-f3be-48fc-a3d4-aa0a4160cbcc/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1271.406482] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-004de619-c1ce-4421-bb53-58ee5d84a322 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1271.415052] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d61f627a-4bae-4872-868e-8fd404d769fe {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1271.444104] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9093f3fe-f09e-4790-b066-779f99e21b84 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1271.451129] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e346662c-ec04-435c-8b49-35a39f89892f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1271.464016] env[67893]: DEBUG nova.compute.provider_tree [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1271.474467] env[67893]: DEBUG nova.scheduler.client.report [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1271.488405] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.415s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1271.488927] env[67893]: ERROR nova.compute.manager [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1271.488927] env[67893]: Faults: ['InvalidArgument'] [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Traceback (most recent call last): [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1271.488927] env[67893]: ERROR 
nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] self.driver.spawn(context, instance, image_meta, [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] self._fetch_image_if_missing(context, vi) [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] image_cache(vi, tmp_image_ds_loc) [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] vm_util.copy_virtual_disk( [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] session._wait_for_task(vmdk_copy_task) [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] return self.wait_for_task(task_ref) [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] return evt.wait() [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] result = hub.switch() [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] return self.greenlet.switch() [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] self.f(*self.args, **self.kw) [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] raise exceptions.translate_fault(task_info.error) [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Faults: ['InvalidArgument'] [ 1271.488927] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] [ 1271.489946] env[67893]: DEBUG nova.compute.utils [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1271.490944] env[67893]: DEBUG nova.compute.manager [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Build of instance 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff was re-scheduled: A specified parameter was not correct: fileType [ 1271.490944] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1271.491345] env[67893]: DEBUG nova.compute.manager [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1271.491585] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Acquiring lock "refresh_cache-5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1271.491732] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Acquired lock "refresh_cache-5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1271.491891] env[67893]: DEBUG nova.network.neutron [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1271.516486] env[67893]: DEBUG nova.network.neutron [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Instance cache missing network info. 
{{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1271.611367] env[67893]: DEBUG nova.network.neutron [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1271.623124] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Releasing lock "refresh_cache-5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1271.623380] env[67893]: DEBUG nova.compute.manager [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1271.623620] env[67893]: DEBUG nova.compute.manager [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Skipping network deallocation for instance since networking was not requested. {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1271.719176] env[67893]: INFO nova.scheduler.client.report [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Deleted allocations for instance 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff [ 1271.742119] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bca7aff2-6538-4937-9c99-623b7e540df3 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Lock "5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 614.289s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1271.743324] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3078c700-ca6c-465a-82fb-b854d5091124 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Lock "5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 415.960s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1271.743601] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3078c700-ca6c-465a-82fb-b854d5091124 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Acquiring lock "5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1271.743841] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3078c700-ca6c-465a-82fb-b854d5091124 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Lock
"5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1271.744058] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3078c700-ca6c-465a-82fb-b854d5091124 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Lock "5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1271.746150] env[67893]: INFO nova.compute.manager [None req-3078c700-ca6c-465a-82fb-b854d5091124 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Terminating instance [ 1271.747679] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3078c700-ca6c-465a-82fb-b854d5091124 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Acquiring lock "refresh_cache-5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1271.747875] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3078c700-ca6c-465a-82fb-b854d5091124 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Acquired lock "refresh_cache-5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1271.748069] env[67893]: DEBUG nova.network.neutron [None req-3078c700-ca6c-465a-82fb-b854d5091124 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1271.754745] env[67893]: DEBUG nova.compute.manager [None req-48dbfc0d-a487-436f-a843-7c89d5b6f827 tempest-ServerTagsTestJSON-1065957766 tempest-ServerTagsTestJSON-1065957766-project-member] [instance: 14f7c0cf-cbf0-4090-89a9-45fe4485cf31] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1271.778355] env[67893]: DEBUG nova.network.neutron [None req-3078c700-ca6c-465a-82fb-b854d5091124 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1271.785352] env[67893]: DEBUG nova.compute.manager [None req-48dbfc0d-a487-436f-a843-7c89d5b6f827 tempest-ServerTagsTestJSON-1065957766 tempest-ServerTagsTestJSON-1065957766-project-member] [instance: 14f7c0cf-cbf0-4090-89a9-45fe4485cf31] Instance disappeared before build.
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1271.809556] env[67893]: DEBUG oslo_concurrency.lockutils [None req-48dbfc0d-a487-436f-a843-7c89d5b6f827 tempest-ServerTagsTestJSON-1065957766 tempest-ServerTagsTestJSON-1065957766-project-member] Lock "14f7c0cf-cbf0-4090-89a9-45fe4485cf31" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 222.018s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1271.819811] env[67893]: DEBUG nova.compute.manager [None req-08aff738-2c74-4d6d-b12a-8c96c6e80fb3 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: d41abc6b-6519-4994-aa17-6b6bd94c93d9] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1271.844771] env[67893]: DEBUG nova.compute.manager [None req-08aff738-2c74-4d6d-b12a-8c96c6e80fb3 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: d41abc6b-6519-4994-aa17-6b6bd94c93d9] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1271.864842] env[67893]: DEBUG oslo_concurrency.lockutils [None req-08aff738-2c74-4d6d-b12a-8c96c6e80fb3 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Lock "d41abc6b-6519-4994-aa17-6b6bd94c93d9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 216.466s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1271.866540] env[67893]: DEBUG nova.network.neutron [None req-3078c700-ca6c-465a-82fb-b854d5091124 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1271.873885] env[67893]: DEBUG nova.compute.manager [None req-1b4a6e13-dc09-48fa-9689-59668a4684f3 tempest-ServersV294TestFqdnHostnames-1022455702 tempest-ServersV294TestFqdnHostnames-1022455702-project-member] [instance: 42938110-2d23-432a-bdb2-30750dac90b4] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1271.876695] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3078c700-ca6c-465a-82fb-b854d5091124 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Releasing lock "refresh_cache-5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1271.877070] env[67893]: DEBUG nova.compute.manager [None req-3078c700-ca6c-465a-82fb-b854d5091124 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Start destroying the instance on the hypervisor. 
{{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1271.877264] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3078c700-ca6c-465a-82fb-b854d5091124 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1271.877862] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9b82f24c-fe5a-4fb2-ba49-edab5064b796 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1271.886883] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b10e6160-9bea-490b-adbf-9f0b37039b89 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1271.897578] env[67893]: DEBUG nova.compute.manager [None req-1b4a6e13-dc09-48fa-9689-59668a4684f3 tempest-ServersV294TestFqdnHostnames-1022455702 tempest-ServersV294TestFqdnHostnames-1022455702-project-member] [instance: 42938110-2d23-432a-bdb2-30750dac90b4] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1271.916627] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-3078c700-ca6c-465a-82fb-b854d5091124 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff could not be found. [ 1271.916804] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3078c700-ca6c-465a-82fb-b854d5091124 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1271.916980] env[67893]: INFO nova.compute.manager [None req-3078c700-ca6c-465a-82fb-b854d5091124 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1271.917239] env[67893]: DEBUG oslo.service.loopingcall [None req-3078c700-ca6c-465a-82fb-b854d5091124 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1271.919172] env[67893]: DEBUG nova.compute.manager [-] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1271.919295] env[67893]: DEBUG nova.network.neutron [-] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1271.930185] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1b4a6e13-dc09-48fa-9689-59668a4684f3 tempest-ServersV294TestFqdnHostnames-1022455702 tempest-ServersV294TestFqdnHostnames-1022455702-project-member] Lock "42938110-2d23-432a-bdb2-30750dac90b4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 210.898s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1271.940421] env[67893]: DEBUG nova.compute.manager [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1272.011977] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1272.012257] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1272.014177] env[67893]: INFO nova.compute.claims [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1272.066546] env[67893]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67893) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1272.066781] env[67893]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1272.067301] env[67893]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-4947140a-3631-48a0-94d2-fb0723f0b478'] [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1272.067301] env[67893]: ERROR 
oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1272.067301] env[67893]: ERROR oslo.service.loopingcall [ 1272.069190] env[67893]: ERROR nova.compute.manager [None req-3078c700-ca6c-465a-82fb-b854d5091124 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1272.096549] env[67893]: ERROR nova.compute.manager [None req-3078c700-ca6c-465a-82fb-b854d5091124 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Traceback (most recent call last): [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] ret = obj(*args, **kwargs) [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] exception_handler_v20(status_code, error_body) [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] raise client_exc(message=error_message, [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Neutron server returns request_ids: ['req-4947140a-3631-48a0-94d2-fb0723f0b478'] [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] During handling of the above exception, another exception occurred: [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Traceback (most recent call last): [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] self._delete_instance(context, instance, bdms) [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] self._shutdown_instance(context, instance, bdms) [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] self._try_deallocate_network(context, instance, requested_networks) [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] with excutils.save_and_reraise_exception(): [ 1272.096549] env[67893]: ERROR 
nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] self.force_reraise() [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] raise self.value [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] _deallocate_network_with_retries() [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] return evt.wait() [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] result = hub.switch() [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] return self.greenlet.switch() [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] result = func(*self.args, **self.kw) [ 1272.096549] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] result = f(*args, **kwargs) [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] self._deallocate_network( [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] self.network_api.deallocate_for_instance( [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 
5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] data = neutron.list_ports(**search_opts) [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] ret = obj(*args, **kwargs) [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] return self.list('ports', self.ports_path, retrieve_all, [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] ret = obj(*args, **kwargs) [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] for r in self._pagination(collection, path, **params): [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] res = self.get(path, params=params) [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] ret = obj(*args, **kwargs) [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] return self.retry_request("GET", action, body=body, [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] ret = obj(*args, **kwargs) [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] return self.do_request(method, action, body=body, [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] ret = obj(*args, **kwargs) [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] self._handle_fault_response(status_code, replybody, resp) [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1272.097951] env[67893]: ERROR nova.compute.manager [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] [ 1272.130155] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3078c700-ca6c-465a-82fb-b854d5091124 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Lock "5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.387s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1272.132042] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 81.322s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1272.132042] env[67893]: INFO nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] During sync_power_state the instance has a pending task (deleting). Skip. [ 1272.132042] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1272.186722] env[67893]: INFO nova.compute.manager [None req-3078c700-ca6c-465a-82fb-b854d5091124 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] [instance: 5cdef8c3-ecd4-4338-b2e6-a8910e3c52ff] Successfully reverted task state from None on failure for instance. [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server [None req-3078c700-ca6c-465a-82fb-b854d5091124 tempest-ServersAaction247Test-1572679610 tempest-ServersAaction247Test-1572679610-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-4947140a-3631-48a0-94d2-fb0723f0b478'] [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server raise self.value [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server raise self.value [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server raise self.value [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 1272.192276] env[67893]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server raise self.value [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server raise self.value [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1272.194494] env[67893]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1272.194494] env[67893]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1272.196497] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1272.196497] env[67893]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1272.196497] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1272.196497] env[67893]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1272.196497] env[67893]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1272.196497] env[67893]: ERROR oslo_messaging.rpc.server [ 1272.332124] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b5087f8-4558-4b19-9969-5975c9f5bfe5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1272.341513] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d0ba7a6-cee0-488a-86bc-27b6f5574319 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1272.370834] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6abfd3dc-7f70-449e-abd4-11a3d811f059 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1272.378095] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f668b70-59ff-4ad9-914e-bea8f1cc5912 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1272.390627] env[67893]: DEBUG nova.compute.provider_tree [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1272.401049] env[67893]: DEBUG nova.scheduler.client.report [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1272.416071] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.403s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1272.416071] env[67893]: DEBUG nova.compute.manager [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Start building networks asynchronously for instance. 
{{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1272.449407] env[67893]: DEBUG nova.compute.utils [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1272.451069] env[67893]: DEBUG nova.compute.manager [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1272.451246] env[67893]: DEBUG nova.network.neutron [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1272.461693] env[67893]: DEBUG nova.compute.manager [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1272.509907] env[67893]: DEBUG nova.policy [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dd097cb06c5a4348a0a98a7d2705d877', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5fc182b40fde498abb43dacf19eed124', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 1272.530994] env[67893]: DEBUG nova.compute.manager [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1272.561223] env[67893]: DEBUG nova.virt.hardware [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1272.561529] env[67893]: DEBUG nova.virt.hardware [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1272.561692] env[67893]: DEBUG nova.virt.hardware [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1272.561874] env[67893]: DEBUG nova.virt.hardware [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1272.562028] env[67893]: DEBUG nova.virt.hardware [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1272.562179] env[67893]: DEBUG nova.virt.hardware [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1272.562384] env[67893]: DEBUG nova.virt.hardware [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1272.562541] env[67893]: DEBUG nova.virt.hardware [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1272.562851] env[67893]: DEBUG nova.virt.hardware [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 
tempest-ImagesTestJSON-2118872471-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1272.563105] env[67893]: DEBUG nova.virt.hardware [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1272.563294] env[67893]: DEBUG nova.virt.hardware [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1272.564506] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6742b6cb-925c-4395-a085-b5f5dee744a9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1272.572911] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2995844-8cf3-40da-82c7-7abf64e08b27 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1272.883502] env[67893]: DEBUG nova.network.neutron [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Successfully created port: 15618ef1-8151-42a0-a212-bbd76935e941 {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1273.449249] env[67893]: DEBUG nova.compute.manager [req-e8ffa60a-f925-44fb-8f5c-f533e90aec40 req-e0e8a24f-f5ef-4167-a667-54ba0cea81bb service nova] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Received event network-vif-plugged-15618ef1-8151-42a0-a212-bbd76935e941 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1273.449478] env[67893]: DEBUG oslo_concurrency.lockutils [req-e8ffa60a-f925-44fb-8f5c-f533e90aec40 req-e0e8a24f-f5ef-4167-a667-54ba0cea81bb service nova] Acquiring lock "8dbbc2e6-9993-4bf0-b66b-6e685789221c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1273.449686] env[67893]: DEBUG oslo_concurrency.lockutils [req-e8ffa60a-f925-44fb-8f5c-f533e90aec40 req-e0e8a24f-f5ef-4167-a667-54ba0cea81bb service nova] Lock "8dbbc2e6-9993-4bf0-b66b-6e685789221c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1273.449855] env[67893]: DEBUG oslo_concurrency.lockutils [req-e8ffa60a-f925-44fb-8f5c-f533e90aec40 req-e0e8a24f-f5ef-4167-a667-54ba0cea81bb service nova] Lock "8dbbc2e6-9993-4bf0-b66b-6e685789221c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1273.450038] env[67893]: DEBUG nova.compute.manager [req-e8ffa60a-f925-44fb-8f5c-f533e90aec40 req-e0e8a24f-f5ef-4167-a667-54ba0cea81bb service nova] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] No 
waiting events found dispatching network-vif-plugged-15618ef1-8151-42a0-a212-bbd76935e941 {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1273.450210] env[67893]: WARNING nova.compute.manager [req-e8ffa60a-f925-44fb-8f5c-f533e90aec40 req-e0e8a24f-f5ef-4167-a667-54ba0cea81bb service nova] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Received unexpected event network-vif-plugged-15618ef1-8151-42a0-a212-bbd76935e941 for instance with vm_state building and task_state spawning. [ 1273.540882] env[67893]: DEBUG nova.network.neutron [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Successfully updated port: 15618ef1-8151-42a0-a212-bbd76935e941 {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1273.556079] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Acquiring lock "refresh_cache-8dbbc2e6-9993-4bf0-b66b-6e685789221c" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1273.556079] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Acquired lock "refresh_cache-8dbbc2e6-9993-4bf0-b66b-6e685789221c" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1273.557017] env[67893]: DEBUG nova.network.neutron [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1273.601732] env[67893]: DEBUG nova.network.neutron [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Instance cache missing network info. 
[ 1273.811044] env[67893]: DEBUG nova.network.neutron [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Updating instance_info_cache with network_info: [{"id": "15618ef1-8151-42a0-a212-bbd76935e941", "address": "fa:16:3e:0f:14:70", "network": {"id": "a023126f-97b0-4ba2-b287-ea8176acba67", "bridge": "br-int", "label": "tempest-ImagesTestJSON-371591198-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5fc182b40fde498abb43dacf19eed124", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c58d99d-ec12-4fc3-ab39-042b3f8cbb89", "external-id": "nsx-vlan-transportzone-44", "segmentation_id": 44, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap15618ef1-81", "ovs_interfaceid": "15618ef1-8151-42a0-a212-bbd76935e941", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1273.825478] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Releasing lock "refresh_cache-8dbbc2e6-9993-4bf0-b66b-6e685789221c" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1273.825842] env[67893]: DEBUG nova.compute.manager [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Instance network_info: |[{"id": "15618ef1-8151-42a0-a212-bbd76935e941", "address": "fa:16:3e:0f:14:70", "network": {"id": "a023126f-97b0-4ba2-b287-ea8176acba67", "bridge": "br-int", "label": "tempest-ImagesTestJSON-371591198-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5fc182b40fde498abb43dacf19eed124", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c58d99d-ec12-4fc3-ab39-042b3f8cbb89", "external-id": "nsx-vlan-transportzone-44", "segmentation_id": 44, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap15618ef1-81", "ovs_interfaceid": "15618ef1-8151-42a0-a212-bbd76935e941", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1273.826259] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0f:14:70', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8c58d99d-ec12-4fc3-ab39-042b3f8cbb89', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '15618ef1-8151-42a0-a212-bbd76935e941', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1273.833552] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Creating folder: Project (5fc182b40fde498abb43dacf19eed124). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1273.834090] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-52903b3a-370a-4291-90d9-200b5b16e3c9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1273.844740] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Created folder: Project (5fc182b40fde498abb43dacf19eed124) in parent group-v689771.
[ 1273.844931] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Creating folder: Instances. Parent ref: group-v689843. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1273.845169] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-20b682c7-89b5-49b5-8a08-0f2b29580b5d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1273.853755] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Created folder: Instances in parent group-v689843.
[ 1273.853977] env[67893]: DEBUG oslo.service.loopingcall [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1273.854279] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1273.854398] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1ceef3af-5708-4929-b103-7a32118e6697 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1273.872654] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1273.872654] env[67893]: value = "task-3455409"
[ 1273.872654] env[67893]: _type = "Task"
[ 1273.872654] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1273.881426] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455409, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1274.382834] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455409, 'name': CreateVM_Task, 'duration_secs': 0.287087} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1274.383171] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1274.469925] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1274.470131] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1274.470484] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1274.470920] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6d778d2d-cf57-44ca-8e34-4896dc9fcc25 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1274.478053] env[67893]: DEBUG oslo_vmware.api [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Waiting for the task: (returnval){
[ 1274.478053] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52f5d779-28f9-935b-df18-7e834bd67947"
[ 1274.478053] env[67893]: _type = "Task"
[ 1274.478053] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1274.484556] env[67893]: DEBUG oslo_vmware.api [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52f5d779-28f9-935b-df18-7e834bd67947, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1274.986385] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1274.986651] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1274.986864] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1275.479283] env[67893]: DEBUG nova.compute.manager [req-b93440f9-948c-4e37-bf0d-93f378a807aa req-03d7bc1c-bb27-40e8-a8b6-6deeacfc3f71 service nova] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Received event network-changed-15618ef1-8151-42a0-a212-bbd76935e941 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1275.479527] env[67893]: DEBUG nova.compute.manager [req-b93440f9-948c-4e37-bf0d-93f378a807aa req-03d7bc1c-bb27-40e8-a8b6-6deeacfc3f71 service nova] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Refreshing instance network info cache due to event network-changed-15618ef1-8151-42a0-a212-bbd76935e941. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1275.479743] env[67893]: DEBUG oslo_concurrency.lockutils [req-b93440f9-948c-4e37-bf0d-93f378a807aa req-03d7bc1c-bb27-40e8-a8b6-6deeacfc3f71 service nova] Acquiring lock "refresh_cache-8dbbc2e6-9993-4bf0-b66b-6e685789221c" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1275.479892] env[67893]: DEBUG oslo_concurrency.lockutils [req-b93440f9-948c-4e37-bf0d-93f378a807aa req-03d7bc1c-bb27-40e8-a8b6-6deeacfc3f71 service nova] Acquired lock "refresh_cache-8dbbc2e6-9993-4bf0-b66b-6e685789221c" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1275.480065] env[67893]: DEBUG nova.network.neutron [req-b93440f9-948c-4e37-bf0d-93f378a807aa req-03d7bc1c-bb27-40e8-a8b6-6deeacfc3f71 service nova] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Refreshing network info cache for port 15618ef1-8151-42a0-a212-bbd76935e941 {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1276.017297] env[67893]: DEBUG nova.network.neutron [req-b93440f9-948c-4e37-bf0d-93f378a807aa req-03d7bc1c-bb27-40e8-a8b6-6deeacfc3f71 service nova] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Updated VIF entry in instance network info cache for port 15618ef1-8151-42a0-a212-bbd76935e941. {{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1276.017696] env[67893]: DEBUG nova.network.neutron [req-b93440f9-948c-4e37-bf0d-93f378a807aa req-03d7bc1c-bb27-40e8-a8b6-6deeacfc3f71 service nova] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Updating instance_info_cache with network_info: [{"id": "15618ef1-8151-42a0-a212-bbd76935e941", "address": "fa:16:3e:0f:14:70", "network": {"id": "a023126f-97b0-4ba2-b287-ea8176acba67", "bridge": "br-int", "label": "tempest-ImagesTestJSON-371591198-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5fc182b40fde498abb43dacf19eed124", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c58d99d-ec12-4fc3-ab39-042b3f8cbb89", "external-id": "nsx-vlan-transportzone-44", "segmentation_id": 44, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap15618ef1-81", "ovs_interfaceid": "15618ef1-8151-42a0-a212-bbd76935e941", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1276.029449] env[67893]: DEBUG oslo_concurrency.lockutils [req-b93440f9-948c-4e37-bf0d-93f378a807aa req-03d7bc1c-bb27-40e8-a8b6-6deeacfc3f71 service nova] Releasing lock "refresh_cache-8dbbc2e6-9993-4bf0-b66b-6e685789221c" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1276.267969] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1a07c534-7070-423e-b830-3460699b3392 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Acquiring lock "8dbbc2e6-9993-4bf0-b66b-6e685789221c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1288.123787] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Acquiring lock "41b5c5ec-936a-4abe-9db7-38d0d2aa371d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1288.124652] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Lock "41b5c5ec-936a-4abe-9db7-38d0d2aa371d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1288.161683] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Acquiring lock "81a6ba30-1d0d-4c4b-9aa1-e9af0cd82e0c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1288.162335] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Lock "81a6ba30-1d0d-4c4b-9aa1-e9af0cd82e0c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1288.189683] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Acquiring lock "14a8db1f-7820-4600-87f4-2788eac02c04" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1288.189683] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Lock "14a8db1f-7820-4600-87f4-2788eac02c04" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1292.111411] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1af0e92d-5776-48c6-a3d7-52cbf6e57208 tempest-MultipleCreateTestJSON-237684718 tempest-MultipleCreateTestJSON-237684718-project-member] Acquiring lock "676994ff-f4a9-4ea6-8ba7-a4f0ed04e63f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1292.112284] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1af0e92d-5776-48c6-a3d7-52cbf6e57208 tempest-MultipleCreateTestJSON-237684718 tempest-MultipleCreateTestJSON-237684718-project-member] Lock "676994ff-f4a9-4ea6-8ba7-a4f0ed04e63f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1292.135872] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1af0e92d-5776-48c6-a3d7-52cbf6e57208 tempest-MultipleCreateTestJSON-237684718 tempest-MultipleCreateTestJSON-237684718-project-member] Acquiring lock "cfd26f59-2527-4108-9765-9206ff27f4f3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1292.137198] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1af0e92d-5776-48c6-a3d7-52cbf6e57208 tempest-MultipleCreateTestJSON-237684718 tempest-MultipleCreateTestJSON-237684718-project-member] Lock "cfd26f59-2527-4108-9765-9206ff27f4f3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1294.290437] env[67893]: DEBUG oslo_concurrency.lockutils [None req-06f0052c-6ff2-43b2-b25c-a14e4c7a5bfa tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquiring lock "279f2b46-c95e-4c6e-a710-7dbfb9edddb5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1294.290721] env[67893]: DEBUG oslo_concurrency.lockutils [None req-06f0052c-6ff2-43b2-b25c-a14e4c7a5bfa tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock "279f2b46-c95e-4c6e-a710-7dbfb9edddb5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1300.734259] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1626100c-4f83-4623-aece-b62a80afe28d tempest-ServersNegativeTestJSON-1739007541 tempest-ServersNegativeTestJSON-1739007541-project-member] Acquiring lock "c5c75fd2-96be-49f6-9dcf-f6f2500c751f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1300.734558] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1626100c-4f83-4623-aece-b62a80afe28d tempest-ServersNegativeTestJSON-1739007541 tempest-ServersNegativeTestJSON-1739007541-project-member] Lock "c5c75fd2-96be-49f6-9dcf-f6f2500c751f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1301.622310] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1303.704178] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bd7a1dac-4382-4046-ae6d-a22429bad93e tempest-InstanceActionsV221TestJSON-844008283 tempest-InstanceActionsV221TestJSON-844008283-project-member] Acquiring lock "c81f4530-ceb3-4cd6-87b2-143ca3c3e5fb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1303.704488] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bd7a1dac-4382-4046-ae6d-a22429bad93e tempest-InstanceActionsV221TestJSON-844008283 tempest-InstanceActionsV221TestJSON-844008283-project-member] Lock "c81f4530-ceb3-4cd6-87b2-143ca3c3e5fb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1304.861078] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1304.861078] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 1304.861078] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 1304.882858] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1304.883378] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1304.883752] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1304.884146] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1304.884438] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1304.884727] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1304.885016] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1304.885329] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1304.885597] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1304.885850] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1304.886118] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 1304.886718] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1306.858056] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1306.858568] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1307.854399] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1307.877045] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1307.877045] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1307.877045] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1307.877543] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 1308.859421] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1308.871717] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1308.871971] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1308.872159] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1308.872479] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 1308.873663] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da442726-f972-4da7-8ceb-83b06bf95bbf {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1308.882775] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a7d59f1-282d-4fc2-9df0-1da2fa2e118c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1308.898212] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28bbb899-f72f-449d-8820-79d7df6d65c4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1308.905093] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a819aaab-505f-4354-9b5a-ad2c2631e24b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1308.935609] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180990MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 1308.935804] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1308.935962] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1309.022021] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1309.022021] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a9656a7e-8a7b-489e-9990-097c1e93e535 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1309.022021] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance fcae7119-6233-4a52-9e52-1147f2b10ddc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1309.022021] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2553f3c0-0988-4e11-a138-7e5f71e71f48 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1309.022021] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c05df6c1-e4c9-4276-9981-e80e584d540c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1309.022021] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5a24adaf-bced-4488-9ccb-fc996b2ba154 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1309.022021] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance efdb0a7e-403d-4de5-8c09-72b9c8f9cd79 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1309.022021] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5ede1991-efee-4c34-af5b-ce71f67456ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1309.022021] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance b3d31ca3-9a7a-49d0-955f-1e12808bf11f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1309.022021] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 8dbbc2e6-9993-4bf0-b66b-6e685789221c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1309.034297] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1309.047079] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 021f1a86-6015-4a22-b501-3ec9079edbec has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1309.058057] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 25d67f98-c132-434b-9d22-4569585527eb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1309.069200] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance bb9f69b8-d92d-4895-8115-0c436fd51367 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1309.079677] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 41b5c5ec-936a-4abe-9db7-38d0d2aa371d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1309.089553] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 81a6ba30-1d0d-4c4b-9aa1-e9af0cd82e0c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1309.100817] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 14a8db1f-7820-4600-87f4-2788eac02c04 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1309.111879] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 676994ff-f4a9-4ea6-8ba7-a4f0ed04e63f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1309.122273] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance cfd26f59-2527-4108-9765-9206ff27f4f3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1309.132981] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 279f2b46-c95e-4c6e-a710-7dbfb9edddb5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1309.143175] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c5c75fd2-96be-49f6-9dcf-f6f2500c751f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1309.153622] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c81f4530-ceb3-4cd6-87b2-143ca3c3e5fb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1309.153852] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 1309.153997] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 1309.386736] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f46cc6d-45df-4f63-89ba-7ee7bdbba152 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1309.394727] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9922daf4-af86-480d-a03b-b5ca35b0e574 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1309.423856] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75501fbe-bfcf-44f0-be3a-4c378afb1451 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1309.430935] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d79e6b50-aed9-4248-b257-5a6b20a2f0b0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1309.444233] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1309.452700] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1309.466422] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 1309.466602] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.531s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1319.804783] env[67893]: WARNING oslo_vmware.rw_handles [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1319.804783] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1319.804783] env[67893]: ERROR oslo_vmware.rw_handles   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1319.804783] env[67893]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 1319.804783] env[67893]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1319.804783] env[67893]: ERROR oslo_vmware.rw_handles     response.begin()
[ 1319.804783] env[67893]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1319.804783] env[67893]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 1319.804783] env[67893]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1319.804783] env[67893]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 1319.804783] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1319.804783] env[67893]: ERROR oslo_vmware.rw_handles
[ 1319.805525] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/7e89694c-f3be-48fc-a3d4-aa0a4160cbcc/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1319.807177] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1319.807421] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Copying Virtual Disk [datastore1] vmware_temp/7e89694c-f3be-48fc-a3d4-aa0a4160cbcc/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/7e89694c-f3be-48fc-a3d4-aa0a4160cbcc/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1319.807714] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-63c30ecb-8e6d-4fbf-bb74-10fb3cfcde3a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1319.816847] env[67893]: DEBUG oslo_vmware.api [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Waiting for the task: (returnval){
[ 1319.816847] env[67893]: value = "task-3455420"
[ 1319.816847] env[67893]: _type = "Task"
[ 1319.816847] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1319.825128] env[67893]: DEBUG oslo_vmware.api [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Task: {'id': task-3455420, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1320.327858] env[67893]: DEBUG oslo_vmware.exceptions [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 1320.328136] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1320.328676] env[67893]: ERROR nova.compute.manager [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1320.328676] env[67893]: Faults: ['InvalidArgument']
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Traceback (most recent call last):
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]   File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]     yield resources
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]     self.driver.spawn(context, instance, image_meta,
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]     self._fetch_image_if_missing(context, vi)
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]     image_cache(vi, tmp_image_ds_loc)
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]     vm_util.copy_virtual_disk(
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]     session._wait_for_task(vmdk_copy_task)
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]     return self.wait_for_task(task_ref)
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]     return evt.wait()
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]     result = hub.switch()
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]     return self.greenlet.switch()
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]     self.f(*self.args, **self.kw)
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]     raise exceptions.translate_fault(task_info.error)
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Faults: ['InvalidArgument']
[ 1320.328676] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e]
[ 1320.329823] env[67893]: INFO nova.compute.manager [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Terminating instance
[ 1320.330568] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1320.330849] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1320.331477] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8de74619-41ff-498a-8626-444a87fbe2ba {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1320.333542] env[67893]: DEBUG nova.compute.manager [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1320.333793] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1320.334467] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d220023a-6d1b-44d0-b8a4-d2e0c66c5b93 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1320.342119] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1320.343853] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1b2f8637-8bde-4297-9486-48d90c96a3d2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1320.347705] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1320.347924] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1320.348704] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5ca6059a-fd8e-4061-8a3e-253915adab21 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1320.355116] env[67893]: DEBUG oslo_vmware.api [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Waiting for the task: (returnval){
[ 1320.355116] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]526e45d7-b449-1322-d81e-0cbe2c8cf30e"
[ 1320.355116] env[67893]: _type = "Task"
[ 1320.355116] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1320.371560] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1320.371787] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Creating directory with path [datastore1] vmware_temp/5f8e7df8-f653-4306-8552-4282b0828562/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1320.372068] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1af02b88-e2d6-4ae0-82f8-cadb7ad9a4c3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1320.400927] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Created directory with path [datastore1] vmware_temp/5f8e7df8-f653-4306-8552-4282b0828562/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1320.401113] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Fetch image to [datastore1] vmware_temp/5f8e7df8-f653-4306-8552-4282b0828562/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1320.401931] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/5f8e7df8-f653-4306-8552-4282b0828562/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1320.402077] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72a1ee6b-9777-4745-a8ff-62d052b91c94 {{(pid=67893) request_handler
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1320.410461] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42bc7082-ab8c-4e2e-bbfe-212f03e98f5d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1320.424098] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-214eb237-e049-4287-9207-b078a606498c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1320.428246] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1320.428458] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1320.428621] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Deleting the datastore file [datastore1] c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1320.428865] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a6e832e9-96c4-4be8-a75a-cdcac3791cae {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1320.460488] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d11772a-60fd-406d-8f9c-1255fe4903e6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1320.463237] env[67893]: DEBUG oslo_vmware.api [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Waiting for the task: (returnval){ [ 1320.463237] env[67893]: value = "task-3455422" [ 1320.463237] env[67893]: _type = "Task" [ 1320.463237] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1320.469019] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0c123c56-e25f-4e55-b5f8-a57370ee6a51 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1320.473939] env[67893]: DEBUG oslo_vmware.api [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Task: {'id': task-3455422, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1320.494565] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1320.544741] env[67893]: DEBUG oslo_vmware.rw_handles [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5f8e7df8-f653-4306-8552-4282b0828562/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1320.606057] env[67893]: DEBUG oslo_vmware.rw_handles [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1320.606272] env[67893]: DEBUG oslo_vmware.rw_handles [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5f8e7df8-f653-4306-8552-4282b0828562/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1320.973747] env[67893]: DEBUG oslo_vmware.api [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Task: {'id': task-3455422, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.215628} completed successfully. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1320.974053] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1320.974209] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1320.974391] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1320.974556] env[67893]: INFO nova.compute.manager [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Took 0.64 seconds to destroy the instance on the hypervisor. [ 1320.976846] env[67893]: DEBUG nova.compute.claims [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1320.977032] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1320.977253] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1321.022975] env[67893]: DEBUG oslo_concurrency.lockutils [None req-21b16e45-006f-42b0-a4b0-c03dc50846a4 tempest-ServerActionsV293TestJSON-1050756707 tempest-ServerActionsV293TestJSON-1050756707-project-member] Acquiring lock "1a903142-d9fc-41a2-b6db-9330ce2506bf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1321.023325] env[67893]: DEBUG oslo_concurrency.lockutils [None req-21b16e45-006f-42b0-a4b0-c03dc50846a4 tempest-ServerActionsV293TestJSON-1050756707 tempest-ServerActionsV293TestJSON-1050756707-project-member] Lock "1a903142-d9fc-41a2-b6db-9330ce2506bf" acquired by
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1321.292948] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e841d8a-14b9-4101-8788-c38b6cc65aca {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1321.301021] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82315435-1fe9-4493-b27c-e214423bae92 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1321.330989] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16cc98c9-bf05-4fa6-bef5-142922120cef {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1321.337928] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cf1144d-23fb-41cc-8ba4-e0818c51f423 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1321.350660] env[67893]: DEBUG nova.compute.provider_tree [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1321.360592] env[67893]: DEBUG nova.scheduler.client.report [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1321.374218] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.397s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1321.374717] env[67893]: ERROR nova.compute.manager [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1321.374717] env[67893]: Faults: ['InvalidArgument'] [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Traceback (most recent call last): [ 1321.374717] env[67893]: ERROR nova.compute.manager 
[instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] self.driver.spawn(context, instance, image_meta, [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] self._fetch_image_if_missing(context, vi) [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] image_cache(vi, tmp_image_ds_loc) [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] vm_util.copy_virtual_disk( [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] session._wait_for_task(vmdk_copy_task) [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] return self.wait_for_task(task_ref) [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] return evt.wait() [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] result = hub.switch() [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] return self.greenlet.switch() [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: 
c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] self.f(*self.args, **self.kw) [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] raise exceptions.translate_fault(task_info.error) [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Faults: ['InvalidArgument'] [ 1321.374717] env[67893]: ERROR nova.compute.manager [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] [ 1321.376235] env[67893]: DEBUG nova.compute.utils [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1321.376811] env[67893]: DEBUG nova.compute.manager [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Build of instance c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e was re-scheduled: A specified parameter was not correct: fileType [ 1321.376811] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1321.377197] env[67893]: DEBUG nova.compute.manager [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1321.377370] env[67893]: DEBUG nova.compute.manager [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1321.377540] env[67893]: DEBUG nova.compute.manager [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1321.377703] env[67893]: DEBUG nova.network.neutron [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1321.708203] env[67893]: DEBUG nova.network.neutron [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1321.722439] env[67893]: INFO nova.compute.manager [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Took 0.34 seconds to deallocate network for instance. [ 1321.849274] env[67893]: INFO nova.scheduler.client.report [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Deleted allocations for instance c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e [ 1321.871799] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a1da38e8-d6e2-4cb8-8a53-7b8832dab86e tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Lock "c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 662.124s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1321.872987] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c8cc79dd-eeb7-4ac8-98fd-5a13cab7f00d tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Lock "c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 466.146s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1321.873212] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c8cc79dd-eeb7-4ac8-98fd-5a13cab7f00d tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Acquiring lock "c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1321.873415] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c8cc79dd-eeb7-4ac8-98fd-5a13cab7f00d tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Lock
"c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1321.873605] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c8cc79dd-eeb7-4ac8-98fd-5a13cab7f00d tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Lock "c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1321.876578] env[67893]: INFO nova.compute.manager [None req-c8cc79dd-eeb7-4ac8-98fd-5a13cab7f00d tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Terminating instance [ 1321.878263] env[67893]: DEBUG nova.compute.manager [None req-c8cc79dd-eeb7-4ac8-98fd-5a13cab7f00d tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1321.878459] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c8cc79dd-eeb7-4ac8-98fd-5a13cab7f00d tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1321.879036] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8ced53a6-a06e-4f05-ae44-326530fdb615 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1321.889112] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca7b0196-596e-41ee-9dc8-c1282a81cba9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1321.899868] env[67893]: DEBUG nova.compute.manager [None req-99a775e0-5cfa-4dd7-8504-164edc52d18a tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: 783b7968-c130-47f5-9ad3-459d0e7eb746] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1321.922341] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-c8cc79dd-eeb7-4ac8-98fd-5a13cab7f00d tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e could not be found. 
[ 1321.922550] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c8cc79dd-eeb7-4ac8-98fd-5a13cab7f00d tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1321.922732] env[67893]: INFO nova.compute.manager [None req-c8cc79dd-eeb7-4ac8-98fd-5a13cab7f00d tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1321.922968] env[67893]: DEBUG oslo.service.loopingcall [None req-c8cc79dd-eeb7-4ac8-98fd-5a13cab7f00d tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1321.923204] env[67893]: DEBUG nova.compute.manager [-] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1321.923299] env[67893]: DEBUG nova.network.neutron [-] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1321.925707] env[67893]: DEBUG nova.compute.manager [None req-99a775e0-5cfa-4dd7-8504-164edc52d18a tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: 783b7968-c130-47f5-9ad3-459d0e7eb746] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1321.945235] env[67893]: DEBUG oslo_concurrency.lockutils [None req-99a775e0-5cfa-4dd7-8504-164edc52d18a tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Lock "783b7968-c130-47f5-9ad3-459d0e7eb746" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 237.248s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1321.952922] env[67893]: DEBUG nova.network.neutron [-] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1321.954221] env[67893]: DEBUG nova.compute.manager [None req-27231dd0-db90-480e-a5f8-adcc2d483328 tempest-VolumesAdminNegativeTest-1794643428 tempest-VolumesAdminNegativeTest-1794643428-project-member] [instance: 1591ce78-4293-4d03-be3f-a2cb552f51f7] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1321.959915] env[67893]: INFO nova.compute.manager [-] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] Took 0.04 seconds to deallocate network for instance. [ 1321.975615] env[67893]: DEBUG nova.compute.manager [None req-27231dd0-db90-480e-a5f8-adcc2d483328 tempest-VolumesAdminNegativeTest-1794643428 tempest-VolumesAdminNegativeTest-1794643428-project-member] [instance: 1591ce78-4293-4d03-be3f-a2cb552f51f7] Instance disappeared before build.
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1321.994045] env[67893]: DEBUG oslo_concurrency.lockutils [None req-27231dd0-db90-480e-a5f8-adcc2d483328 tempest-VolumesAdminNegativeTest-1794643428 tempest-VolumesAdminNegativeTest-1794643428-project-member] Lock "1591ce78-4293-4d03-be3f-a2cb552f51f7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 229.670s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1322.003034] env[67893]: DEBUG nova.compute.manager [None req-5d0af375-942f-46af-ab55-f6c3b906a963 tempest-MultipleCreateTestJSON-237684718 tempest-MultipleCreateTestJSON-237684718-project-member] [instance: 46d7643f-00ab-4953-9a4c-e07b96615f2a] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1322.027672] env[67893]: DEBUG nova.compute.manager [None req-5d0af375-942f-46af-ab55-f6c3b906a963 tempest-MultipleCreateTestJSON-237684718 tempest-MultipleCreateTestJSON-237684718-project-member] [instance: 46d7643f-00ab-4953-9a4c-e07b96615f2a] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1322.044938] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c8cc79dd-eeb7-4ac8-98fd-5a13cab7f00d tempest-VolumesAssistedSnapshotsTest-1548932782 tempest-VolumesAssistedSnapshotsTest-1548932782-project-member] Lock "c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.172s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1322.045998] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 131.236s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1322.046198] env[67893]: INFO nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e] During sync_power_state the instance has a pending task (deleting). Skip.
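The lockutils records carry two timings per named lock: how long the caller waited to acquire it and how long it was held (here the failed build held the instance lock for 662.124s while the terminate request waited 466.146s behind it). A toy version of that accounting, assuming plain in-process threading locks rather than oslo.concurrency's real implementation:

    import threading
    import time
    from contextlib import contextmanager

    _locks = {}                        # lock name -> threading.Lock
    _registry_guard = threading.Lock()

    @contextmanager
    def timed_lock(name, caller):
        # Mimics the "acquired :: waited Ns" / "released :: held Ns" bookkeeping above.
        with _registry_guard:
            lock = _locks.setdefault(name, threading.Lock())
        t0 = time.monotonic()
        lock.acquire()
        waited = time.monotonic() - t0
        print(f'Lock "{name}" acquired by "{caller}" :: waited {waited:.3f}s')
        t1 = time.monotonic()
        try:
            yield
        finally:
            lock.release()
            print(f'Lock "{name}" "released" by "{caller}" :: held '
                  f'{time.monotonic() - t1:.3f}s')

The long wait on the instance lock is expected here: terminate_instance cannot proceed until the (re-scheduled) build path gives the per-instance lock up.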
[ 1322.046437] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "c1bcede0-cb1a-4647-aed8-a77ed7eb2d8e" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1322.053117] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5d0af375-942f-46af-ab55-f6c3b906a963 tempest-MultipleCreateTestJSON-237684718 tempest-MultipleCreateTestJSON-237684718-project-member] Lock "46d7643f-00ab-4953-9a4c-e07b96615f2a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 227.743s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1322.060919] env[67893]: DEBUG nova.compute.manager [None req-5d0af375-942f-46af-ab55-f6c3b906a963 tempest-MultipleCreateTestJSON-237684718 tempest-MultipleCreateTestJSON-237684718-project-member] [instance: dd596db2-a53c-4609-a1da-6db1ec79846e] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1322.083727] env[67893]: DEBUG nova.compute.manager [None req-5d0af375-942f-46af-ab55-f6c3b906a963 tempest-MultipleCreateTestJSON-237684718 tempest-MultipleCreateTestJSON-237684718-project-member] [instance: dd596db2-a53c-4609-a1da-6db1ec79846e] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1322.106948] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5d0af375-942f-46af-ab55-f6c3b906a963 tempest-MultipleCreateTestJSON-237684718 tempest-MultipleCreateTestJSON-237684718-project-member] Lock "dd596db2-a53c-4609-a1da-6db1ec79846e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 227.761s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1322.115312] env[67893]: DEBUG nova.compute.manager [None req-8bf17f9f-a8b2-446e-96a0-b0ad0d8e23c6 tempest-SecurityGroupsTestJSON-756338800 tempest-SecurityGroupsTestJSON-756338800-project-member] [instance: 039d691f-31fe-4020-90aa-82905198e13d] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1322.138061] env[67893]: DEBUG nova.compute.manager [None req-8bf17f9f-a8b2-446e-96a0-b0ad0d8e23c6 tempest-SecurityGroupsTestJSON-756338800 tempest-SecurityGroupsTestJSON-756338800-project-member] [instance: 039d691f-31fe-4020-90aa-82905198e13d] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1322.160712] env[67893]: DEBUG oslo_concurrency.lockutils [None req-8bf17f9f-a8b2-446e-96a0-b0ad0d8e23c6 tempest-SecurityGroupsTestJSON-756338800 tempest-SecurityGroupsTestJSON-756338800-project-member] Lock "039d691f-31fe-4020-90aa-82905198e13d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 226.291s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1322.169825] env[67893]: DEBUG nova.compute.manager [None req-4549290e-89f7-4fe1-8a84-5564c0e9a898 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: f7bcf0fe-9569-4b61-be9e-c29f4116cb11] Starting instance...
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1322.194074] env[67893]: DEBUG nova.compute.manager [None req-4549290e-89f7-4fe1-8a84-5564c0e9a898 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: f7bcf0fe-9569-4b61-be9e-c29f4116cb11] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1322.217243] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4549290e-89f7-4fe1-8a84-5564c0e9a898 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock "f7bcf0fe-9569-4b61-be9e-c29f4116cb11" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 226.344s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1322.227384] env[67893]: DEBUG nova.compute.manager [None req-872de797-89e9-4dc7-8428-9fa09a0d0f94 tempest-FloatingIPsAssociationNegativeTestJSON-1059429559 tempest-FloatingIPsAssociationNegativeTestJSON-1059429559-project-member] [instance: 0363316a-cf39-4741-baa9-a040d7486df2] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1322.254084] env[67893]: DEBUG nova.compute.manager [None req-872de797-89e9-4dc7-8428-9fa09a0d0f94 tempest-FloatingIPsAssociationNegativeTestJSON-1059429559 tempest-FloatingIPsAssociationNegativeTestJSON-1059429559-project-member] [instance: 0363316a-cf39-4741-baa9-a040d7486df2] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1322.274844] env[67893]: DEBUG oslo_concurrency.lockutils [None req-872de797-89e9-4dc7-8428-9fa09a0d0f94 tempest-FloatingIPsAssociationNegativeTestJSON-1059429559 tempest-FloatingIPsAssociationNegativeTestJSON-1059429559-project-member] Lock "0363316a-cf39-4741-baa9-a040d7486df2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 223.119s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1322.287040] env[67893]: DEBUG nova.compute.manager [None req-4d7b3cdf-2c75-4414-a5f5-2ff9671ab70c tempest-ServerShowV257Test-908506870 tempest-ServerShowV257Test-908506870-project-member] [instance: 9dce8f0a-8fbe-43a5-af0b-ab9f76055bef] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1322.312220] env[67893]: DEBUG nova.compute.manager [None req-4d7b3cdf-2c75-4414-a5f5-2ff9671ab70c tempest-ServerShowV257Test-908506870 tempest-ServerShowV257Test-908506870-project-member] [instance: 9dce8f0a-8fbe-43a5-af0b-ab9f76055bef] Instance disappeared before build.
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1322.332342] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4d7b3cdf-2c75-4414-a5f5-2ff9671ab70c tempest-ServerShowV257Test-908506870 tempest-ServerShowV257Test-908506870-project-member] Lock "9dce8f0a-8fbe-43a5-af0b-ab9f76055bef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 222.645s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1322.340825] env[67893]: DEBUG nova.compute.manager [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1322.393499] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1322.393762] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1322.395451] env[67893]: INFO nova.compute.claims [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1322.682279] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42883357-3a59-41dd-b46a-3daa69fefc7d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1322.690150] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1da14b37-17d2-4e75-91e5-38ab442f6601 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1322.720856] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6dd3cfe-fd3b-4df8-b1a8-1840e72c12bd {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1322.728791] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a065a275-c726-4f66-88fe-c3ff6d191d18 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1322.742464] env[67893]: DEBUG nova.compute.provider_tree [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory
/opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1322.751361] env[67893]: DEBUG nova.scheduler.client.report [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1322.769953] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.376s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1322.770743] env[67893]: DEBUG nova.compute.manager [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1322.806062] env[67893]: DEBUG nova.compute.utils [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1322.807590] env[67893]: DEBUG nova.compute.manager [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1322.807771] env[67893]: DEBUG nova.network.neutron [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1322.816503] env[67893]: DEBUG nova.compute.manager [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Start building block device mappings for instance. 
{{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1322.875035] env[67893]: DEBUG nova.policy [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2ec2807ecc5e41f88d55e41c7bacd19b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd2796e0cae2d4e9ab93bf514064409d2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 1322.884498] env[67893]: DEBUG nova.compute.manager [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Start spawning the instance on the hypervisor. {{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1322.913896] env[67893]: DEBUG nova.virt.hardware [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=<?>,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-15T20:39:38Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1322.914278] env[67893]: DEBUG nova.virt.hardware [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1322.914471] env[67893]: DEBUG nova.virt.hardware [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1322.914706] env[67893]: DEBUG nova.virt.hardware [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1322.914892] env[67893]: DEBUG nova.virt.hardware [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1322.915084] env[67893]: DEBUG
nova.virt.hardware [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1322.915411] env[67893]: DEBUG nova.virt.hardware [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1322.915817] env[67893]: DEBUG nova.virt.hardware [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1322.916072] env[67893]: DEBUG nova.virt.hardware [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1322.916357] env[67893]: DEBUG nova.virt.hardware [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1322.917254] env[67893]: DEBUG nova.virt.hardware [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1322.918484] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67492d00-b0d9-4058-b2ba-074ffe65ac75 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1322.926929] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fe80fae-6cba-4444-a58f-43f9a529ec51 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1323.227626] env[67893]: DEBUG nova.network.neutron [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Successfully created port: cc505466-ed90-43c7-814b-8ef71cad5e87 {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1323.882989] env[67893]: DEBUG nova.compute.manager [req-8b6ad89d-638e-4cee-a029-1d33a268c2a4 req-cc3b80ba-b97d-4831-8fa1-91d769628622 service nova] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Received event network-vif-plugged-cc505466-ed90-43c7-814b-8ef71cad5e87 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1323.883246] env[67893]: DEBUG oslo_concurrency.lockutils [req-8b6ad89d-638e-4cee-a029-1d33a268c2a4 
req-cc3b80ba-b97d-4831-8fa1-91d769628622 service nova] Acquiring lock "1068cd1b-317e-42d5-b348-5bfdbb2b4dc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1323.883498] env[67893]: DEBUG oslo_concurrency.lockutils [req-8b6ad89d-638e-4cee-a029-1d33a268c2a4 req-cc3b80ba-b97d-4831-8fa1-91d769628622 service nova] Lock "1068cd1b-317e-42d5-b348-5bfdbb2b4dc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1323.883678] env[67893]: DEBUG oslo_concurrency.lockutils [req-8b6ad89d-638e-4cee-a029-1d33a268c2a4 req-cc3b80ba-b97d-4831-8fa1-91d769628622 service nova] Lock "1068cd1b-317e-42d5-b348-5bfdbb2b4dc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1323.883845] env[67893]: DEBUG nova.compute.manager [req-8b6ad89d-638e-4cee-a029-1d33a268c2a4 req-cc3b80ba-b97d-4831-8fa1-91d769628622 service nova] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] No waiting events found dispatching network-vif-plugged-cc505466-ed90-43c7-814b-8ef71cad5e87 {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1323.884159] env[67893]: WARNING nova.compute.manager [req-8b6ad89d-638e-4cee-a029-1d33a268c2a4 req-cc3b80ba-b97d-4831-8fa1-91d769628622 service nova] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Received unexpected event network-vif-plugged-cc505466-ed90-43c7-814b-8ef71cad5e87 for instance with vm_state building and task_state spawning. 
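The WARNING just above is the usual race between Neutron delivering the external network-vif-plugged event and the compute manager registering a waiter for it: the event handler pops a matching waiting event under a per-instance lock and, finding none while the instance is still building/spawning, logs the event as unexpected and drops it. Below is a minimal stdlib sketch of that pop-or-warn pattern; the class and method names loosely mirror the ones in the log, but this is an illustration of the pattern, not Nova's actual implementation (which uses oslo.concurrency and eventlet rather than threading).

```python
# Sketch only: the "pop a waiting event or warn" pattern visible above.
import threading

class InstanceEvents:
    def __init__(self):
        self._lock = threading.Lock()   # stands in for the oslo.concurrency lock
        self._events = {}               # {instance_uuid: {event_tag: threading.Event}}

    def prepare_for_event(self, instance_uuid, tag):
        """Register a waiter before triggering the action that emits the event."""
        with self._lock:
            ev = threading.Event()
            self._events.setdefault(instance_uuid, {})[tag] = ev
            return ev

    def pop_instance_event(self, instance_uuid, tag):
        """Pop the waiter when the external event arrives (may return None)."""
        with self._lock:
            return self._events.get(instance_uuid, {}).pop(tag, None)

def external_instance_event(events, instance_uuid, tag):
    ev = events.pop_instance_event(instance_uuid, tag)
    if ev is None:
        # Corresponds to the WARNING above: no waiter registered yet.
        print(f"Received unexpected event {tag} for instance {instance_uuid}")
    else:
        ev.set()  # wake whoever called prepare_for_event() and is waiting

events = InstanceEvents()
external_instance_event(
    events,
    "1068cd1b-317e-42d5-b348-5bfdbb2b4dc0",
    "network-vif-plugged-cc505466-ed90-43c7-814b-8ef71cad5e87",
)
```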
[ 1324.020598] env[67893]: DEBUG nova.network.neutron [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Successfully updated port: cc505466-ed90-43c7-814b-8ef71cad5e87 {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1324.033053] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Acquiring lock "refresh_cache-1068cd1b-317e-42d5-b348-5bfdbb2b4dc0" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1324.038869] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Acquired lock "refresh_cache-1068cd1b-317e-42d5-b348-5bfdbb2b4dc0" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1324.038869] env[67893]: DEBUG nova.network.neutron [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1324.096085] env[67893]: DEBUG nova.network.neutron [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Instance cache missing network info. 
{{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1324.266357] env[67893]: DEBUG nova.network.neutron [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Updating instance_info_cache with network_info: [{"id": "cc505466-ed90-43c7-814b-8ef71cad5e87", "address": "fa:16:3e:c1:ab:5c", "network": {"id": "bdf1beee-5146-4801-9442-0f17cb5dbd6e", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-487061069-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d2796e0cae2d4e9ab93bf514064409d2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1fb81f98-6f5a-47ab-a512-27277591d064", "external-id": "nsx-vlan-transportzone-624", "segmentation_id": 624, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcc505466-ed", "ovs_interfaceid": "cc505466-ed90-43c7-814b-8ef71cad5e87", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1324.277165] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Releasing lock "refresh_cache-1068cd1b-317e-42d5-b348-5bfdbb2b4dc0" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1324.277430] env[67893]: DEBUG nova.compute.manager [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Instance network_info: |[{"id": "cc505466-ed90-43c7-814b-8ef71cad5e87", "address": "fa:16:3e:c1:ab:5c", "network": {"id": "bdf1beee-5146-4801-9442-0f17cb5dbd6e", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-487061069-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d2796e0cae2d4e9ab93bf514064409d2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1fb81f98-6f5a-47ab-a512-27277591d064", "external-id": "nsx-vlan-transportzone-624", "segmentation_id": 624, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcc505466-ed", "ovs_interfaceid": "cc505466-ed90-43c7-814b-8ef71cad5e87", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1324.277908] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c1:ab:5c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1fb81f98-6f5a-47ab-a512-27277591d064', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'cc505466-ed90-43c7-814b-8ef71cad5e87', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1324.285234] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Creating folder: Project (d2796e0cae2d4e9ab93bf514064409d2). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1324.285850] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d15036d2-9077-4960-8c54-60ba6955bc53 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1324.298175] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Created folder: Project (d2796e0cae2d4e9ab93bf514064409d2) in parent group-v689771. [ 1324.298381] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Creating folder: Instances. Parent ref: group-v689850. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1324.298671] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fb08f436-6cbd-4697-a1c3-782b02c4eb5f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1324.308011] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Created folder: Instances in parent group-v689850. [ 1324.308280] env[67893]: DEBUG oslo.service.loopingcall [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1324.308464] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1324.308659] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-039b2471-469a-42c9-a1d7-bf77dd5a5f04 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1324.326865] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1324.326865] env[67893]: value = "task-3455425" [ 1324.326865] env[67893]: _type = "Task" [ 1324.326865] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1324.334060] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455425, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1324.836847] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455425, 'name': CreateVM_Task} progress is 99%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1325.338489] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455425, 'name': CreateVM_Task} progress is 99%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1325.840047] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455425, 'name': CreateVM_Task, 'duration_secs': 1.303469} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1325.840047] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1325.840662] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1325.840808] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1325.841138] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1325.841385] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a0a19e5b-9e92-440d-8694-566407550ce8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
1325.846224] env[67893]: DEBUG oslo_vmware.api [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Waiting for the task: (returnval){ [ 1325.846224] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52ede15c-0f18-c284-a937-547d2c0a536c" [ 1325.846224] env[67893]: _type = "Task" [ 1325.846224] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1325.854153] env[67893]: DEBUG oslo_vmware.api [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52ede15c-0f18-c284-a937-547d2c0a536c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1325.925106] env[67893]: DEBUG nova.compute.manager [req-3e35c219-a581-4a73-a8bd-2d32ca4babd7 req-8177fbe5-7f38-40ce-a8dc-d873812d62a6 service nova] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Received event network-changed-cc505466-ed90-43c7-814b-8ef71cad5e87 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1325.925323] env[67893]: DEBUG nova.compute.manager [req-3e35c219-a581-4a73-a8bd-2d32ca4babd7 req-8177fbe5-7f38-40ce-a8dc-d873812d62a6 service nova] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Refreshing instance network info cache due to event network-changed-cc505466-ed90-43c7-814b-8ef71cad5e87. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1325.925703] env[67893]: DEBUG oslo_concurrency.lockutils [req-3e35c219-a581-4a73-a8bd-2d32ca4babd7 req-8177fbe5-7f38-40ce-a8dc-d873812d62a6 service nova] Acquiring lock "refresh_cache-1068cd1b-317e-42d5-b348-5bfdbb2b4dc0" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1325.925881] env[67893]: DEBUG oslo_concurrency.lockutils [req-3e35c219-a581-4a73-a8bd-2d32ca4babd7 req-8177fbe5-7f38-40ce-a8dc-d873812d62a6 service nova] Acquired lock "refresh_cache-1068cd1b-317e-42d5-b348-5bfdbb2b4dc0" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1325.926074] env[67893]: DEBUG nova.network.neutron [req-3e35c219-a581-4a73-a8bd-2d32ca4babd7 req-8177fbe5-7f38-40ce-a8dc-d873812d62a6 service nova] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Refreshing network info cache for port cc505466-ed90-43c7-814b-8ef71cad5e87 {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1326.357871] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1326.357871] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 
1326.358097] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1326.398654] env[67893]: DEBUG nova.network.neutron [req-3e35c219-a581-4a73-a8bd-2d32ca4babd7 req-8177fbe5-7f38-40ce-a8dc-d873812d62a6 service nova] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Updated VIF entry in instance network info cache for port cc505466-ed90-43c7-814b-8ef71cad5e87. {{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1326.398993] env[67893]: DEBUG nova.network.neutron [req-3e35c219-a581-4a73-a8bd-2d32ca4babd7 req-8177fbe5-7f38-40ce-a8dc-d873812d62a6 service nova] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Updating instance_info_cache with network_info: [{"id": "cc505466-ed90-43c7-814b-8ef71cad5e87", "address": "fa:16:3e:c1:ab:5c", "network": {"id": "bdf1beee-5146-4801-9442-0f17cb5dbd6e", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-487061069-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d2796e0cae2d4e9ab93bf514064409d2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1fb81f98-6f5a-47ab-a512-27277591d064", "external-id": "nsx-vlan-transportzone-624", "segmentation_id": 624, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcc505466-ed", "ovs_interfaceid": "cc505466-ed90-43c7-814b-8ef71cad5e87", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1326.409232] env[67893]: DEBUG oslo_concurrency.lockutils [req-3e35c219-a581-4a73-a8bd-2d32ca4babd7 req-8177fbe5-7f38-40ce-a8dc-d873812d62a6 service nova] Releasing lock "refresh_cache-1068cd1b-317e-42d5-b348-5bfdbb2b4dc0" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1362.466645] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1364.859881] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1364.860228] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1364.860228] env[67893]: DEBUG 
nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1364.886334] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1364.886492] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1364.886623] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1364.886747] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1364.886869] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1364.887019] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1364.887215] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1364.887289] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1364.887401] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1364.887519] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1364.887637] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1366.858604] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1366.858885] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1367.858969] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1368.736977] env[67893]: WARNING oslo_vmware.rw_handles [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1368.736977] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1368.736977] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1368.736977] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1368.736977] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1368.736977] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 1368.736977] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1368.736977] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1368.736977] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1368.736977] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1368.736977] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1368.736977] env[67893]: ERROR oslo_vmware.rw_handles [ 1368.737390] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/5f8e7df8-f653-4306-8552-4282b0828562/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1368.739654] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1368.739961] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 
tempest-ServersTestMultiNic-237832542-project-member] Copying Virtual Disk [datastore1] vmware_temp/5f8e7df8-f653-4306-8552-4282b0828562/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/5f8e7df8-f653-4306-8552-4282b0828562/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1368.740257] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4dd34a10-d876-4c56-89cc-f7fdc03fe93e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1368.748129] env[67893]: DEBUG oslo_vmware.api [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Waiting for the task: (returnval){ [ 1368.748129] env[67893]: value = "task-3455426" [ 1368.748129] env[67893]: _type = "Task" [ 1368.748129] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1368.756051] env[67893]: DEBUG oslo_vmware.api [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Task: {'id': task-3455426, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1368.853780] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1368.858432] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1368.859055] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1368.859055] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1369.257545] env[67893]: DEBUG oslo_vmware.exceptions [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Fault InvalidArgument not matched. 
{{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1369.257821] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1369.258429] env[67893]: ERROR nova.compute.manager [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1369.258429] env[67893]: Faults: ['InvalidArgument'] [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Traceback (most recent call last): [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] yield resources [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] self.driver.spawn(context, instance, image_meta, [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] self._fetch_image_if_missing(context, vi) [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] image_cache(vi, tmp_image_ds_loc) [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] vm_util.copy_virtual_disk( [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] session._wait_for_task(vmdk_copy_task) [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 
157, in _wait_for_task [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] return self.wait_for_task(task_ref) [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] return evt.wait() [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] result = hub.switch() [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] return self.greenlet.switch() [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] self.f(*self.args, **self.kw) [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] raise exceptions.translate_fault(task_info.error) [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Faults: ['InvalidArgument'] [ 1369.258429] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] [ 1369.259984] env[67893]: INFO nova.compute.manager [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Terminating instance [ 1369.260256] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1369.260460] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1369.260689] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-64a3115e-6dff-4c35-9c9e-b17a7020a3bf {{(pid=67893) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1369.263070] env[67893]: DEBUG nova.compute.manager [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1369.263261] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1369.263997] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abe108d9-9f77-40d9-9704-b364527af240 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1369.270306] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1369.270511] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b4893da4-ad95-449b-86c1-407464aa06a0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1369.272495] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1369.272667] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1369.273642] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-650c86af-89c7-4484-a99b-36744057fbf5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1369.278601] env[67893]: DEBUG oslo_vmware.api [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Waiting for the task: (returnval){ [ 1369.278601] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52a087c1-a05a-453e-dd75-cee31ac37f2b" [ 1369.278601] env[67893]: _type = "Task" [ 1369.278601] env[67893]: } to complete. 
{{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1369.290089] env[67893]: DEBUG oslo_vmware.api [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52a087c1-a05a-453e-dd75-cee31ac37f2b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1369.401225] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1369.401351] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1369.402691] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Deleting the datastore file [datastore1] a9656a7e-8a7b-489e-9990-097c1e93e535 {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1369.402691] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e785be0d-705c-4100-ae03-19185939678b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1369.408036] env[67893]: DEBUG oslo_vmware.api [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Waiting for the task: (returnval){ [ 1369.408036] env[67893]: value = "task-3455428" [ 1369.408036] env[67893]: _type = "Task" [ 1369.408036] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1369.415283] env[67893]: DEBUG oslo_vmware.api [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Task: {'id': task-3455428, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1369.789415] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1369.789675] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Creating directory with path [datastore1] vmware_temp/7881f7c1-c886-43ac-873e-5d01406b0ecc/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1369.789905] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cf91c898-9562-47fe-b6e1-cd23cb9f58f8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1369.801067] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Created directory with path [datastore1] vmware_temp/7881f7c1-c886-43ac-873e-5d01406b0ecc/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1369.801280] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Fetch image to [datastore1] vmware_temp/7881f7c1-c886-43ac-873e-5d01406b0ecc/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1369.801451] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/7881f7c1-c886-43ac-873e-5d01406b0ecc/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1369.802179] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1846e533-6277-42dd-af56-648032cc04ea {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1369.809051] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-065d7645-a7ef-439f-8df8-2c36347fd864 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1369.817742] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f948670-1465-4779-8865-333b9271e4ac {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1369.848420] env[67893]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8cb331d-dac1-42ff-a8ef-0bdc5e5c635e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1369.854216] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-54a41267-1ac6-4cb2-b3be-1d50f51c8dbf {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1369.874634] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1369.917116] env[67893]: DEBUG oslo_vmware.api [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Task: {'id': task-3455428, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.094235} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1369.917271] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1369.917456] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1369.917627] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1369.917794] env[67893]: INFO nova.compute.manager [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Took 0.65 seconds to destroy the instance on the hypervisor. 
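The CreateVM_Task, SearchDatastore_Task and DeleteDatastoreFile_Task lines above all come from the same polling loop: submit a vCenter task, repeatedly fetch its task info until the state reaches success or error, and turn an error into an exception (the VimFaultException seen elsewhere in this section). Here is a simplified, self-contained sketch of such a loop; it assumes a get_task_info() callable returning an object with .state, .progress and .error attributes, and it is not oslo.vmware's actual API surface.

```python
# Sketch of a task-polling loop like the one producing the
# "Task: {...} progress is N%" DEBUG lines above.
import time

class TaskError(Exception):
    pass

def wait_for_task(get_task_info, poll_interval=0.5, timeout=300.0):
    """Poll until the task succeeds, raising TaskError on failure or timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = get_task_info()
        if info.state == 'success':
            return info
        if info.state == 'error':
            # Analogous to translating the fault and raising, as in the
            # _poll_task frames of the traceback earlier in this log.
            raise TaskError(info.error)
        print(f"progress is {info.progress}%")  # like the DEBUG lines above
        time.sleep(poll_interval)
    raise TaskError("timed out waiting for task")

# Tiny self-test with a fake task that completes on the third poll:
class _FakeTask:
    def __init__(self):
        self.polls = 0
    def __call__(self):
        self.polls += 1
        state = 'success' if self.polls >= 3 else 'running'
        progress = 99 if self.polls > 1 else 0
        return type('Info', (), {'state': state, 'progress': progress,
                                 'error': None})()

wait_for_task(_FakeTask(), poll_interval=0.01)
```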
[ 1369.919853] env[67893]: DEBUG nova.compute.claims [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1369.920034] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1369.920255] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1370.024278] env[67893]: DEBUG oslo_vmware.rw_handles [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7881f7c1-c886-43ac-873e-5d01406b0ecc/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1370.084615] env[67893]: DEBUG oslo_vmware.rw_handles [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1370.084851] env[67893]: DEBUG oslo_vmware.rw_handles [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7881f7c1-c886-43ac-873e-5d01406b0ecc/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1370.273072] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92d3fd2d-5ff6-4e6f-8fb1-8c0739f986a8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1370.281327] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b410a824-1cfa-488a-9363-09cdb342601c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1370.310738] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65309664-4e91-428f-bfa9-e752ed78a812 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1370.318123] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fa344ed-9a22-4677-a2d2-1fc6c87c8aea {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1370.332246] env[67893]: DEBUG nova.compute.provider_tree [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1370.340819] env[67893]: DEBUG nova.scheduler.client.report [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1370.354973] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.434s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1370.355762] env[67893]: ERROR nova.compute.manager [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1370.355762] env[67893]: Faults: ['InvalidArgument'] [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Traceback (most recent call last): [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1370.355762] env[67893]: ERROR nova.compute.manager 
[instance: a9656a7e-8a7b-489e-9990-097c1e93e535] self.driver.spawn(context, instance, image_meta, [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] self._fetch_image_if_missing(context, vi) [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] image_cache(vi, tmp_image_ds_loc) [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] vm_util.copy_virtual_disk( [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] session._wait_for_task(vmdk_copy_task) [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] return self.wait_for_task(task_ref) [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] return evt.wait() [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] result = hub.switch() [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] return self.greenlet.switch() [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] self.f(*self.args, **self.kw) [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] raise exceptions.translate_fault(task_info.error) [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Faults: ['InvalidArgument'] [ 1370.355762] env[67893]: ERROR nova.compute.manager [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] [ 1370.356748] env[67893]: DEBUG nova.compute.utils [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1370.358521] env[67893]: DEBUG nova.compute.manager [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Build of instance a9656a7e-8a7b-489e-9990-097c1e93e535 was re-scheduled: A specified parameter was not correct: fileType [ 1370.358521] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1370.358923] env[67893]: DEBUG nova.compute.manager [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1370.359100] env[67893]: DEBUG nova.compute.manager [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1370.359329] env[67893]: DEBUG nova.compute.manager [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1370.359432] env[67893]: DEBUG nova.network.neutron [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1370.858989] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1370.869186] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1370.869442] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1370.869622] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1370.869785] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1370.870904] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ecab467-1e42-462a-8c2d-e74f625680d4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1370.880051] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0589a7f-1e8c-4d71-b9f9-41caa87b6416 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1370.893822] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0075059b-f3b6-4d5a-a348-15f588101a06 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1370.902024] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8b0c11c-7e21-4cd5-8945-04d8bcad4548 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} 
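The VimFaultException in the traceback above is oslo.vmware's generic wrapper for a failed vCenter task: wait_for_task() polls the TaskInfo until the task errors, then raises with the translated fault list (here ['InvalidArgument']) and the vCenter message ("A specified parameter was not correct: fileType"). A minimal sketch of handling it at a call site, assuming an oslo_vmware.api.VMwareAPISession; the helper name and the recovery policy are illustrative, not Nova's actual code:

    from oslo_vmware import exceptions as vexc

    def wait_for_disk_copy(session, copy_task):
        """Block on a vCenter task reference (e.g. from CopyVirtualDisk_Task)."""
        try:
            # wait_for_task() drives the polling loop seen in the log and
            # returns the TaskInfo once the task succeeds.
            return session.wait_for_task(copy_task)
        except vexc.VimFaultException as e:
            # e.fault_list mirrors the "Faults: ['InvalidArgument']" line in
            # the traceback; str(e) carries the vCenter error message.
            if 'InvalidArgument' in e.fault_list:
                raise RuntimeError('vCenter rejected the disk copy: %s' % e)
            raise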
[ 1370.932464] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180949MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1370.932632] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1370.932832] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1371.038812] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a9656a7e-8a7b-489e-9990-097c1e93e535 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1371.039095] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance fcae7119-6233-4a52-9e52-1147f2b10ddc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1371.039289] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2553f3c0-0988-4e11-a138-7e5f71e71f48 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1371.039469] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c05df6c1-e4c9-4276-9981-e80e584d540c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1371.039641] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5a24adaf-bced-4488-9ccb-fc996b2ba154 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1371.039813] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance efdb0a7e-403d-4de5-8c09-72b9c8f9cd79 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1371.039982] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5ede1991-efee-4c34-af5b-ce71f67456ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1371.040174] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance b3d31ca3-9a7a-49d0-955f-1e12808bf11f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1371.040347] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 8dbbc2e6-9993-4bf0-b66b-6e685789221c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1371.040509] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1371.053902] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 021f1a86-6015-4a22-b501-3ec9079edbec has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1371.063708] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 25d67f98-c132-434b-9d22-4569585527eb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1371.076806] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance bb9f69b8-d92d-4895-8115-0c436fd51367 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1371.092898] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 41b5c5ec-936a-4abe-9db7-38d0d2aa371d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1371.108085] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 81a6ba30-1d0d-4c4b-9aa1-e9af0cd82e0c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1371.118135] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 14a8db1f-7820-4600-87f4-2788eac02c04 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1371.129208] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 676994ff-f4a9-4ea6-8ba7-a4f0ed04e63f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1371.147437] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance cfd26f59-2527-4108-9765-9206ff27f4f3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1371.158588] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 279f2b46-c95e-4c6e-a710-7dbfb9edddb5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1371.170731] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c5c75fd2-96be-49f6-9dcf-f6f2500c751f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1371.181351] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c81f4530-ceb3-4cd6-87b2-143ca3c3e5fb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1371.192991] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 1a903142-d9fc-41a2-b6db-9330ce2506bf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1371.193215] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1371.193363] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1371.498060] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8017ccf5-c429-4516-9506-279ccf8efbac {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1371.506958] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-253582ea-8c06-4214-b77e-70958d2929d8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1371.542390] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fb80299-8e74-4637-b231-5d75d20d590c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1371.551759] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44cfba3c-4c35-4d6f-aaed-759f48ff2b25 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1371.566424] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1371.568170] env[67893]: DEBUG nova.network.neutron [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1371.575938] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1371.580153] env[67893]: INFO nova.compute.manager [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Took 1.22 seconds to deallocate network for instance. [ 1371.590551] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1371.590761] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.658s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1371.671917] env[67893]: INFO nova.scheduler.client.report [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Deleted allocations for instance a9656a7e-8a7b-489e-9990-097c1e93e535 [ 1371.694904] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5b70fed0-8bea-4585-b1ea-56b1da609bf5 tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Lock "a9656a7e-8a7b-489e-9990-097c1e93e535" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 687.657s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1371.696188] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3c3f1ce2-f468-46fe-8c53-10e731c88eef tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Lock "a9656a7e-8a7b-489e-9990-097c1e93e535" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 487.327s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1371.696398] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3c3f1ce2-f468-46fe-8c53-10e731c88eef tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Acquiring lock "a9656a7e-8a7b-489e-9990-097c1e93e535-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1371.696603] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3c3f1ce2-f468-46fe-8c53-10e731c88eef tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Lock "a9656a7e-8a7b-489e-9990-097c1e93e535-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1371.696765] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3c3f1ce2-f468-46fe-8c53-10e731c88eef tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Lock
"a9656a7e-8a7b-489e-9990-097c1e93e535-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1371.698694] env[67893]: INFO nova.compute.manager [None req-3c3f1ce2-f468-46fe-8c53-10e731c88eef tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Terminating instance [ 1371.700246] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3c3f1ce2-f468-46fe-8c53-10e731c88eef tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Acquiring lock "refresh_cache-a9656a7e-8a7b-489e-9990-097c1e93e535" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1371.700401] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3c3f1ce2-f468-46fe-8c53-10e731c88eef tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Acquired lock "refresh_cache-a9656a7e-8a7b-489e-9990-097c1e93e535" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1371.700567] env[67893]: DEBUG nova.network.neutron [None req-3c3f1ce2-f468-46fe-8c53-10e731c88eef tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1371.715817] env[67893]: DEBUG nova.compute.manager [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1371.732846] env[67893]: DEBUG nova.network.neutron [None req-3c3f1ce2-f468-46fe-8c53-10e731c88eef tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Instance cache missing network info. 
{{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1371.780859] env[67893]: DEBUG oslo_concurrency.lockutils [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1371.781161] env[67893]: DEBUG oslo_concurrency.lockutils [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1371.782795] env[67893]: INFO nova.compute.claims [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1371.886591] env[67893]: DEBUG nova.network.neutron [None req-3c3f1ce2-f468-46fe-8c53-10e731c88eef tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1371.896213] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3c3f1ce2-f468-46fe-8c53-10e731c88eef tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Releasing lock "refresh_cache-a9656a7e-8a7b-489e-9990-097c1e93e535" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1371.896730] env[67893]: DEBUG nova.compute.manager [None req-3c3f1ce2-f468-46fe-8c53-10e731c88eef tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Start destroying the instance on the hypervisor. 
{{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1371.896926] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3c3f1ce2-f468-46fe-8c53-10e731c88eef tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1371.897482] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-19e5a29e-cff6-4045-9171-54fc44355439 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1371.909269] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63a23f5b-1d74-4f6d-8bb4-fe884ef1e185 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1371.940204] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-3c3f1ce2-f468-46fe-8c53-10e731c88eef tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a9656a7e-8a7b-489e-9990-097c1e93e535 could not be found. [ 1371.940413] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3c3f1ce2-f468-46fe-8c53-10e731c88eef tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1371.940586] env[67893]: INFO nova.compute.manager [None req-3c3f1ce2-f468-46fe-8c53-10e731c88eef tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1371.940819] env[67893]: DEBUG oslo.service.loopingcall [None req-3c3f1ce2-f468-46fe-8c53-10e731c88eef tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1371.943156] env[67893]: DEBUG nova.compute.manager [-] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1371.943260] env[67893]: DEBUG nova.network.neutron [-] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1371.959390] env[67893]: DEBUG nova.network.neutron [-] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1371.967666] env[67893]: DEBUG nova.network.neutron [-] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1371.977869] env[67893]: INFO nova.compute.manager [-] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] Took 0.03 seconds to deallocate network for instance.
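The "Waiting for function ..._deallocate_network_with_retries to return." line above is emitted by oslo.service's looping-call machinery: Nova wraps network deallocation in a retry decorator so transient Neutron failures do not leak port allocations. A minimal sketch of that pattern, assuming oslo_service.loopingcall.RetryDecorator semantics; the retry budget and the placeholder exception are assumptions for illustration, not Nova's actual values:

    from oslo_service import loopingcall

    class TransientNeutronError(Exception):
        """Placeholder for the connection errors worth retrying."""

    # RetryDecorator re-invokes the wrapped function through a looping call,
    # sleeping between attempts (starting at inc_sleep_time, capped at
    # max_sleep_time) and re-raising once max_retry_count is exhausted;
    # exceptions not listed in `exceptions` propagate immediately.
    @loopingcall.RetryDecorator(max_retry_count=3, inc_sleep_time=2,
                                max_sleep_time=12,
                                exceptions=(TransientNeutronError,))
    def deallocate_network_with_retries():
        # A real implementation would call something like
        # network_api.deallocate_for_instance(context, instance) here.
        print('deallocating network')

    deallocate_network_with_retries()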
[ 1372.069854] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3c3f1ce2-f468-46fe-8c53-10e731c88eef tempest-ServersTestMultiNic-237832542 tempest-ServersTestMultiNic-237832542-project-member] Lock "a9656a7e-8a7b-489e-9990-097c1e93e535" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.374s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1372.070989] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "a9656a7e-8a7b-489e-9990-097c1e93e535" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 181.261s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1372.070989] env[67893]: INFO nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a9656a7e-8a7b-489e-9990-097c1e93e535] During sync_power_state the instance has a pending task (deleting). Skip. [ 1372.071148] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "a9656a7e-8a7b-489e-9990-097c1e93e535" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1372.098613] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-961f3975-cb52-4dca-a122-2ad3a3bad26e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1372.108167] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-499d2989-7763-423d-a188-ad9fea878558 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1372.138627] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d9d8b9f-1224-434c-99e8-fe145b5d8580 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1372.146407] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a270f1f7-7b55-48a8-8761-69aef24b6315 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1372.159987] env[67893]: DEBUG nova.compute.provider_tree [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1372.170624] env[67893]: DEBUG nova.scheduler.client.report [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400,
'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1372.186388] env[67893]: DEBUG oslo_concurrency.lockutils [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.405s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1372.186904] env[67893]: DEBUG nova.compute.manager [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1372.219115] env[67893]: DEBUG nova.compute.utils [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1372.220458] env[67893]: DEBUG nova.compute.manager [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1372.220632] env[67893]: DEBUG nova.network.neutron [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1372.228364] env[67893]: DEBUG nova.compute.manager [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1372.293020] env[67893]: DEBUG nova.compute.manager [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1372.307145] env[67893]: DEBUG nova.policy [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2a7cf83daef347089e10728559ab9d26', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '093685d267204cd99da54a398df3682b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 1372.321291] env[67893]: DEBUG nova.virt.hardware [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1372.321532] env[67893]: DEBUG nova.virt.hardware [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1372.321684] env[67893]: DEBUG nova.virt.hardware [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1372.321859] env[67893]: DEBUG nova.virt.hardware [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1372.322010] env[67893]: DEBUG nova.virt.hardware [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1372.322166] env[67893]: DEBUG nova.virt.hardware [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:430}} [ 1372.322369] env[67893]: DEBUG nova.virt.hardware [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1372.322525] env[67893]: DEBUG nova.virt.hardware [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1372.322691] env[67893]: DEBUG nova.virt.hardware [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1372.322864] env[67893]: DEBUG nova.virt.hardware [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1372.323163] env[67893]: DEBUG nova.virt.hardware [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1372.324012] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f9deaba-38dc-4e0b-acff-fa35bd94b8d1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1372.332072] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f88df5a-ec2d-40e4-bc7d-c3817bda8405 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1372.654221] env[67893]: DEBUG nova.network.neutron [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Successfully created port: e7c5c5a2-4a8f-4ca6-8988-ba5364dbd6c0 {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1373.272245] env[67893]: DEBUG nova.network.neutron [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Successfully updated port: e7c5c5a2-4a8f-4ca6-8988-ba5364dbd6c0 {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1373.285283] env[67893]: DEBUG oslo_concurrency.lockutils [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Acquiring lock "refresh_cache-021f1a86-6015-4a22-b501-3ec9079edbec" {{(pid=67893) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1373.285770] env[67893]: DEBUG oslo_concurrency.lockutils [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Acquired lock "refresh_cache-021f1a86-6015-4a22-b501-3ec9079edbec" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1373.285770] env[67893]: DEBUG nova.network.neutron [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1373.323326] env[67893]: DEBUG nova.network.neutron [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1373.521644] env[67893]: DEBUG nova.network.neutron [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Updating instance_info_cache with network_info: [{"id": "e7c5c5a2-4a8f-4ca6-8988-ba5364dbd6c0", "address": "fa:16:3e:c4:3d:a9", "network": {"id": "f56dd9da-3db9-478d-84c5-e0354db59d15", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-732765990-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "093685d267204cd99da54a398df3682b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape7c5c5a2-4a", "ovs_interfaceid": "e7c5c5a2-4a8f-4ca6-8988-ba5364dbd6c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1373.532113] env[67893]: DEBUG oslo_concurrency.lockutils [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Releasing lock "refresh_cache-021f1a86-6015-4a22-b501-3ec9079edbec" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1373.532395] env[67893]: DEBUG nova.compute.manager [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Instance network_info: |[{"id": "e7c5c5a2-4a8f-4ca6-8988-ba5364dbd6c0", "address": 
"fa:16:3e:c4:3d:a9", "network": {"id": "f56dd9da-3db9-478d-84c5-e0354db59d15", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-732765990-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "093685d267204cd99da54a398df3682b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape7c5c5a2-4a", "ovs_interfaceid": "e7c5c5a2-4a8f-4ca6-8988-ba5364dbd6c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1373.532756] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c4:3d:a9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '27138a4c-60c9-45fb-bf37-4c2f765315a3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e7c5c5a2-4a8f-4ca6-8988-ba5364dbd6c0', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1373.540373] env[67893]: DEBUG oslo.service.loopingcall [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1373.540805] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1373.541041] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6c2eb553-c9b5-45f8-87b3-6ebadb635bba {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1373.561482] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1373.561482] env[67893]: value = "task-3455429" [ 1373.561482] env[67893]: _type = "Task" [ 1373.561482] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1373.568938] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455429, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1374.071765] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455429, 'name': CreateVM_Task, 'duration_secs': 0.295795} completed successfully. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1374.071930] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1374.072617] env[67893]: DEBUG oslo_concurrency.lockutils [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1374.072776] env[67893]: DEBUG oslo_concurrency.lockutils [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1374.073103] env[67893]: DEBUG oslo_concurrency.lockutils [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1374.073353] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e4003b81-2c4d-45a5-aa18-616c8af804a8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1374.077579] env[67893]: DEBUG oslo_vmware.api [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Waiting for the task: (returnval){ [ 1374.077579] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]522c07bd-998c-6e08-05dd-88bbb53b3837" [ 1374.077579] env[67893]: _type = "Task" [ 1374.077579] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1374.084621] env[67893]: DEBUG oslo_vmware.api [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]522c07bd-998c-6e08-05dd-88bbb53b3837, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1374.589118] env[67893]: DEBUG oslo_concurrency.lockutils [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1374.589423] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1374.589595] env[67893]: DEBUG oslo_concurrency.lockutils [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1375.035988] env[67893]: DEBUG nova.compute.manager [req-6d35bb78-cd4c-420a-8448-0430c468ad8e req-6d7a7bf4-cfcf-47ab-8d73-2ee1513e21b1 service nova] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Received event network-vif-plugged-e7c5c5a2-4a8f-4ca6-8988-ba5364dbd6c0 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1375.036377] env[67893]: DEBUG oslo_concurrency.lockutils [req-6d35bb78-cd4c-420a-8448-0430c468ad8e req-6d7a7bf4-cfcf-47ab-8d73-2ee1513e21b1 service nova] Acquiring lock "021f1a86-6015-4a22-b501-3ec9079edbec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1375.036628] env[67893]: DEBUG oslo_concurrency.lockutils [req-6d35bb78-cd4c-420a-8448-0430c468ad8e req-6d7a7bf4-cfcf-47ab-8d73-2ee1513e21b1 service nova] Lock "021f1a86-6015-4a22-b501-3ec9079edbec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1375.036831] env[67893]: DEBUG oslo_concurrency.lockutils [req-6d35bb78-cd4c-420a-8448-0430c468ad8e req-6d7a7bf4-cfcf-47ab-8d73-2ee1513e21b1 service nova] Lock "021f1a86-6015-4a22-b501-3ec9079edbec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1375.037041] env[67893]: DEBUG nova.compute.manager [req-6d35bb78-cd4c-420a-8448-0430c468ad8e req-6d7a7bf4-cfcf-47ab-8d73-2ee1513e21b1 service nova] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] No waiting events found dispatching network-vif-plugged-e7c5c5a2-4a8f-4ca6-8988-ba5364dbd6c0 {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1375.037255] env[67893]: WARNING nova.compute.manager [req-6d35bb78-cd4c-420a-8448-0430c468ad8e req-6d7a7bf4-cfcf-47ab-8d73-2ee1513e21b1 service nova] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Received unexpected event
network-vif-plugged-e7c5c5a2-4a8f-4ca6-8988-ba5364dbd6c0 for instance with vm_state building and task_state spawning. [ 1375.037449] env[67893]: DEBUG nova.compute.manager [req-6d35bb78-cd4c-420a-8448-0430c468ad8e req-6d7a7bf4-cfcf-47ab-8d73-2ee1513e21b1 service nova] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Received event network-changed-e7c5c5a2-4a8f-4ca6-8988-ba5364dbd6c0 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1375.037633] env[67893]: DEBUG nova.compute.manager [req-6d35bb78-cd4c-420a-8448-0430c468ad8e req-6d7a7bf4-cfcf-47ab-8d73-2ee1513e21b1 service nova] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Refreshing instance network info cache due to event network-changed-e7c5c5a2-4a8f-4ca6-8988-ba5364dbd6c0. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1375.037847] env[67893]: DEBUG oslo_concurrency.lockutils [req-6d35bb78-cd4c-420a-8448-0430c468ad8e req-6d7a7bf4-cfcf-47ab-8d73-2ee1513e21b1 service nova] Acquiring lock "refresh_cache-021f1a86-6015-4a22-b501-3ec9079edbec" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1375.038019] env[67893]: DEBUG oslo_concurrency.lockutils [req-6d35bb78-cd4c-420a-8448-0430c468ad8e req-6d7a7bf4-cfcf-47ab-8d73-2ee1513e21b1 service nova] Acquired lock "refresh_cache-021f1a86-6015-4a22-b501-3ec9079edbec" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1375.038222] env[67893]: DEBUG nova.network.neutron [req-6d35bb78-cd4c-420a-8448-0430c468ad8e req-6d7a7bf4-cfcf-47ab-8d73-2ee1513e21b1 service nova] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Refreshing network info cache for port e7c5c5a2-4a8f-4ca6-8988-ba5364dbd6c0 {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1375.366899] env[67893]: DEBUG nova.network.neutron [req-6d35bb78-cd4c-420a-8448-0430c468ad8e req-6d7a7bf4-cfcf-47ab-8d73-2ee1513e21b1 service nova] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Updated VIF entry in instance network info cache for port e7c5c5a2-4a8f-4ca6-8988-ba5364dbd6c0. 
{{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1375.367186] env[67893]: DEBUG nova.network.neutron [req-6d35bb78-cd4c-420a-8448-0430c468ad8e req-6d7a7bf4-cfcf-47ab-8d73-2ee1513e21b1 service nova] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Updating instance_info_cache with network_info: [{"id": "e7c5c5a2-4a8f-4ca6-8988-ba5364dbd6c0", "address": "fa:16:3e:c4:3d:a9", "network": {"id": "f56dd9da-3db9-478d-84c5-e0354db59d15", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-732765990-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "093685d267204cd99da54a398df3682b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape7c5c5a2-4a", "ovs_interfaceid": "e7c5c5a2-4a8f-4ca6-8988-ba5364dbd6c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1375.377406] env[67893]: DEBUG oslo_concurrency.lockutils [req-6d35bb78-cd4c-420a-8448-0430c468ad8e req-6d7a7bf4-cfcf-47ab-8d73-2ee1513e21b1 service nova] Releasing lock "refresh_cache-021f1a86-6015-4a22-b501-3ec9079edbec" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1377.791444] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b52b7eb6-0937-4f37-9f1e-6f0a8f8bd897 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Acquiring lock "1068cd1b-317e-42d5-b348-5bfdbb2b4dc0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1381.938735] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "e1849daf-3781-42ef-bede-267efbb652c9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1381.939325] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "e1849daf-3781-42ef-bede-267efbb652c9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1417.963013] env[67893]: WARNING oslo_vmware.rw_handles [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663
tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1417.963013] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1417.963013] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1417.963013] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1417.963013] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1417.963013] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 1417.963013] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1417.963013] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1417.963013] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1417.963013] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1417.963013] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1417.963013] env[67893]: ERROR oslo_vmware.rw_handles [ 1417.963621] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/7881f7c1-c886-43ac-873e-5d01406b0ecc/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1417.965553] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1417.965784] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Copying Virtual Disk [datastore1] vmware_temp/7881f7c1-c886-43ac-873e-5d01406b0ecc/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/7881f7c1-c886-43ac-873e-5d01406b0ecc/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1417.966094] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6bf5e691-82b7-4fcc-956d-55bbd1495646 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1417.974417] env[67893]: DEBUG oslo_vmware.api [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Waiting for the task: (returnval){ [ 1417.974417] env[67893]: value = "task-3455430" [ 1417.974417] env[67893]: _type = "Task" [ 1417.974417] env[67893]: } to complete. 
{{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1417.982734] env[67893]: DEBUG oslo_vmware.api [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Task: {'id': task-3455430, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1418.484671] env[67893]: DEBUG oslo_vmware.exceptions [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1418.484954] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1418.485518] env[67893]: ERROR nova.compute.manager [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1418.485518] env[67893]: Faults: ['InvalidArgument'] [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Traceback (most recent call last): [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] yield resources [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] self.driver.spawn(context, instance, image_meta, [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] self._fetch_image_if_missing(context, vi) [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] image_cache(vi, tmp_image_ds_loc) [ 1418.485518] 
env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] vm_util.copy_virtual_disk( [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] session._wait_for_task(vmdk_copy_task) [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] return self.wait_for_task(task_ref) [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] return evt.wait() [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] result = hub.switch() [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] return self.greenlet.switch() [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] self.f(*self.args, **self.kw) [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] raise exceptions.translate_fault(task_info.error) [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Faults: ['InvalidArgument'] [ 1418.485518] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] [ 1418.486276] env[67893]: INFO nova.compute.manager [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Terminating instance [ 1418.487389] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 
tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1418.487595] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1418.487830] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f834b9b4-6716-4068-b472-7f2b2226a092 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1418.491078] env[67893]: DEBUG nova.compute.manager [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1418.491270] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1418.491987] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed1b3695-7b85-4311-8105-6453056f1a30 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1418.498722] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1418.498940] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-fd06a53c-b574-45f8-bdcf-e7b6715c6b8a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1418.500922] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1418.501113] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1418.502037] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3266d57c-eefc-48a7-a3eb-3b38d4b4d763 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1418.506663] env[67893]: DEBUG oslo_vmware.api [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Waiting for the task: (returnval){ [ 1418.506663] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52199563-ec06-441c-c5b0-88698a6f4b3e" [ 1418.506663] env[67893]: _type = "Task" [ 1418.506663] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1418.513331] env[67893]: DEBUG oslo_vmware.api [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52199563-ec06-441c-c5b0-88698a6f4b3e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1418.578058] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1418.578058] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1418.578226] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Deleting the datastore file [datastore1] fcae7119-6233-4a52-9e52-1147f2b10ddc {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1418.578433] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9ff49bcf-53ef-405a-b244-97dc60b35ad9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1418.585263] env[67893]: DEBUG oslo_vmware.api [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Waiting for the task: (returnval){ [ 1418.585263] env[67893]: value = "task-3455432" [ 1418.585263] env[67893]: _type = "Task" [ 1418.585263] env[67893]: } to complete. 
{{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1418.592902] env[67893]: DEBUG oslo_vmware.api [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Task: {'id': task-3455432, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1419.016538] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1419.016837] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Creating directory with path [datastore1] vmware_temp/e9fb1031-b362-44f5-8229-af8b53acb218/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1419.017088] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4f9d2204-418c-4e77-824b-08f247a69247 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1419.027976] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Created directory with path [datastore1] vmware_temp/e9fb1031-b362-44f5-8229-af8b53acb218/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1419.028188] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Fetch image to [datastore1] vmware_temp/e9fb1031-b362-44f5-8229-af8b53acb218/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1419.028358] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/e9fb1031-b362-44f5-8229-af8b53acb218/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1419.029076] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc275191-9e2e-430a-8129-4bcc3fdbb028 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1419.035643] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e195fd9d-d08a-4f1f-b55a-e43fa310749f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
1419.044249] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db05e3fb-6aa9-44ba-ad19-4e23ae35258e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1419.074927] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe9124b7-7722-482d-8302-4602da78c323 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1419.080201] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-cb281f82-aaf3-419c-b855-3e4fdd0eb2ac {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1419.093059] env[67893]: DEBUG oslo_vmware.api [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Task: {'id': task-3455432, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065326} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1419.093234] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1419.093418] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1419.093593] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1419.093771] env[67893]: INFO nova.compute.manager [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1419.095827] env[67893]: DEBUG nova.compute.claims [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1419.095998] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1419.096227] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1419.102309] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1419.154139] env[67893]: DEBUG oslo_vmware.rw_handles [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e9fb1031-b362-44f5-8229-af8b53acb218/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1419.214158] env[67893]: DEBUG oslo_vmware.rw_handles [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1419.214346] env[67893]: DEBUG oslo_vmware.rw_handles [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e9fb1031-b362-44f5-8229-af8b53acb218/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1419.436713] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-909a6f8b-b7d5-4eac-b46c-2a9337f38a58 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1419.445087] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec55a88a-bc8c-4099-94d7-4cd7b511fac9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1419.474319] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14572b32-9f89-4ada-88d8-8a01bf849b49 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1419.480963] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35022eb0-d980-42f5-af3a-9d0c6b0be1d6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1419.494259] env[67893]: DEBUG nova.compute.provider_tree [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1419.503253] env[67893]: DEBUG nova.scheduler.client.report [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1419.518780] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.422s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1419.519294] env[67893]: ERROR nova.compute.manager [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1419.519294] env[67893]: Faults: ['InvalidArgument'] [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Traceback (most recent call last): [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] File 
"/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] self.driver.spawn(context, instance, image_meta, [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] self._fetch_image_if_missing(context, vi) [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] image_cache(vi, tmp_image_ds_loc) [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] vm_util.copy_virtual_disk( [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] session._wait_for_task(vmdk_copy_task) [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] return self.wait_for_task(task_ref) [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] return evt.wait() [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] result = hub.switch() [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] return self.greenlet.switch() [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] self.f(*self.args, **self.kw) [ 1419.519294] 
env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] raise exceptions.translate_fault(task_info.error) [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Faults: ['InvalidArgument'] [ 1419.519294] env[67893]: ERROR nova.compute.manager [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] [ 1419.520139] env[67893]: DEBUG nova.compute.utils [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1419.521254] env[67893]: DEBUG nova.compute.manager [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Build of instance fcae7119-6233-4a52-9e52-1147f2b10ddc was re-scheduled: A specified parameter was not correct: fileType [ 1419.521254] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1419.521619] env[67893]: DEBUG nova.compute.manager [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1419.521789] env[67893]: DEBUG nova.compute.manager [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1419.521956] env[67893]: DEBUG nova.compute.manager [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1419.522135] env[67893]: DEBUG nova.network.neutron [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1419.834109] env[67893]: DEBUG nova.network.neutron [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1419.851664] env[67893]: INFO nova.compute.manager [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Took 0.33 seconds to deallocate network for instance. [ 1419.951601] env[67893]: INFO nova.scheduler.client.report [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Deleted allocations for instance fcae7119-6233-4a52-9e52-1147f2b10ddc [ 1419.976445] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4e0a76c3-9681-4fe1-892c-b4bd7e43cea0 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Lock "fcae7119-6233-4a52-9e52-1147f2b10ddc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 680.645s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1419.979160] env[67893]: DEBUG oslo_concurrency.lockutils [None req-11017f01-7cf4-413e-9b37-8dbf3743e9a2 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Lock "fcae7119-6233-4a52-9e52-1147f2b10ddc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 484.631s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1419.979160] env[67893]: DEBUG oslo_concurrency.lockutils [None req-11017f01-7cf4-413e-9b37-8dbf3743e9a2 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Acquiring lock "fcae7119-6233-4a52-9e52-1147f2b10ddc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1419.979160] env[67893]: DEBUG oslo_concurrency.lockutils [None req-11017f01-7cf4-413e-9b37-8dbf3743e9a2 tempest-InstanceActionsNegativeTestJSON-1352342663
tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Lock "fcae7119-6233-4a52-9e52-1147f2b10ddc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1419.979160] env[67893]: DEBUG oslo_concurrency.lockutils [None req-11017f01-7cf4-413e-9b37-8dbf3743e9a2 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Lock "fcae7119-6233-4a52-9e52-1147f2b10ddc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1419.981337] env[67893]: INFO nova.compute.manager [None req-11017f01-7cf4-413e-9b37-8dbf3743e9a2 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Terminating instance [ 1419.983060] env[67893]: DEBUG nova.compute.manager [None req-11017f01-7cf4-413e-9b37-8dbf3743e9a2 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1419.983251] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-11017f01-7cf4-413e-9b37-8dbf3743e9a2 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1419.983837] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6877ed34-e596-476e-8521-931bb186d45e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1419.997311] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9293240a-e3d8-42c6-901d-c5c7f96b5c9a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1420.008367] env[67893]: DEBUG nova.compute.manager [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1420.030401] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-11017f01-7cf4-413e-9b37-8dbf3743e9a2 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance fcae7119-6233-4a52-9e52-1147f2b10ddc could not be found.
[ 1420.030676] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-11017f01-7cf4-413e-9b37-8dbf3743e9a2 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1420.030774] env[67893]: INFO nova.compute.manager [None req-11017f01-7cf4-413e-9b37-8dbf3743e9a2 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1420.031050] env[67893]: DEBUG oslo.service.loopingcall [None req-11017f01-7cf4-413e-9b37-8dbf3743e9a2 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1420.031349] env[67893]: DEBUG nova.compute.manager [-] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1420.031416] env[67893]: DEBUG nova.network.neutron [-] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1420.057346] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1420.057583] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1420.059112] env[67893]: INFO nova.compute.claims [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1420.068572] env[67893]: DEBUG nova.network.neutron [-] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1420.082943] env[67893]: INFO nova.compute.manager [-] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] Took 0.05 seconds to deallocate network for instance.
[ 1420.168981] env[67893]: DEBUG oslo_concurrency.lockutils [None req-11017f01-7cf4-413e-9b37-8dbf3743e9a2 tempest-InstanceActionsNegativeTestJSON-1352342663 tempest-InstanceActionsNegativeTestJSON-1352342663-project-member] Lock "fcae7119-6233-4a52-9e52-1147f2b10ddc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.191s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1420.170227] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "fcae7119-6233-4a52-9e52-1147f2b10ddc" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 229.360s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1420.170533] env[67893]: INFO nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: fcae7119-6233-4a52-9e52-1147f2b10ddc] During sync_power_state the instance has a pending task (deleting). Skip. [ 1420.170750] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "fcae7119-6233-4a52-9e52-1147f2b10ddc" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1420.337980] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c696e35b-da15-4576-bd88-93a0c5a9e617 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1420.345681] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8579f51f-d097-434b-8553-a0ca0da32440 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1420.374619] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-468b9b82-28d7-4a08-9a57-d43958627f91 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1420.381485] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1776dfd2-ed24-469f-8ad8-b9140ec7dde7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1420.394055] env[67893]: DEBUG nova.compute.provider_tree [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1420.402759] env[67893]: DEBUG nova.scheduler.client.report [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB':
{'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1420.416255] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.359s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1420.416781] env[67893]: DEBUG nova.compute.manager [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1420.451027] env[67893]: DEBUG nova.compute.utils [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1420.453029] env[67893]: DEBUG nova.compute.manager [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1420.453029] env[67893]: DEBUG nova.network.neutron [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1420.460495] env[67893]: DEBUG nova.compute.manager [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1420.521949] env[67893]: DEBUG nova.policy [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '125ce06a20be4a3aa82550cf33482bba', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ceacadba48b74fc3aeaf5968e3a9a0cd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 1420.525248] env[67893]: DEBUG nova.compute.manager [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1420.550388] env[67893]: DEBUG nova.virt.hardware [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=<?>,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-15T20:39:38Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1420.550665] env[67893]: DEBUG nova.virt.hardware [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1420.550825] env[67893]: DEBUG nova.virt.hardware [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1420.551013] env[67893]: DEBUG nova.virt.hardware [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1420.551169] env[67893]: DEBUG nova.virt.hardware [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1420.551315] env[67893]: DEBUG nova.virt.hardware [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1420.551515] env[67893]: DEBUG nova.virt.hardware [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1420.551671] env[67893]: DEBUG nova.virt.hardware [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1420.551870]
env[67893]: DEBUG nova.virt.hardware [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1420.552064] env[67893]: DEBUG nova.virt.hardware [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1420.552241] env[67893]: DEBUG nova.virt.hardware [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1420.553089] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8f2bd38-962f-485c-9790-abb1571b0dea {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1420.562808] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-076daeb0-71c2-41b8-9609-9ea8e64d962d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1421.104721] env[67893]: DEBUG nova.network.neutron [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Successfully created port: 72d045c4-ce8d-49f3-a547-712079ff01a5 {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1421.730730] env[67893]: DEBUG nova.network.neutron [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Successfully updated port: 72d045c4-ce8d-49f3-a547-712079ff01a5 {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1421.752141] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Acquiring lock "refresh_cache-25d67f98-c132-434b-9d22-4569585527eb" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1421.752141] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Acquired lock "refresh_cache-25d67f98-c132-434b-9d22-4569585527eb" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1421.752141] env[67893]: DEBUG nova.network.neutron [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1421.791848] env[67893]: DEBUG 
nova.network.neutron [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1421.874574] env[67893]: DEBUG nova.compute.manager [req-2ece2d10-2e58-4048-99fd-11ec0625fcdd req-07667920-3bcc-4d3f-a2de-f729a3a38018 service nova] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Received event network-vif-plugged-72d045c4-ce8d-49f3-a547-712079ff01a5 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1421.874802] env[67893]: DEBUG oslo_concurrency.lockutils [req-2ece2d10-2e58-4048-99fd-11ec0625fcdd req-07667920-3bcc-4d3f-a2de-f729a3a38018 service nova] Acquiring lock "25d67f98-c132-434b-9d22-4569585527eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1421.875027] env[67893]: DEBUG oslo_concurrency.lockutils [req-2ece2d10-2e58-4048-99fd-11ec0625fcdd req-07667920-3bcc-4d3f-a2de-f729a3a38018 service nova] Lock "25d67f98-c132-434b-9d22-4569585527eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1421.875197] env[67893]: DEBUG oslo_concurrency.lockutils [req-2ece2d10-2e58-4048-99fd-11ec0625fcdd req-07667920-3bcc-4d3f-a2de-f729a3a38018 service nova] Lock "25d67f98-c132-434b-9d22-4569585527eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1421.875366] env[67893]: DEBUG nova.compute.manager [req-2ece2d10-2e58-4048-99fd-11ec0625fcdd req-07667920-3bcc-4d3f-a2de-f729a3a38018 service nova] [instance: 25d67f98-c132-434b-9d22-4569585527eb] No waiting events found dispatching network-vif-plugged-72d045c4-ce8d-49f3-a547-712079ff01a5 {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1421.875534] env[67893]: WARNING nova.compute.manager [req-2ece2d10-2e58-4048-99fd-11ec0625fcdd req-07667920-3bcc-4d3f-a2de-f729a3a38018 service nova] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Received unexpected event network-vif-plugged-72d045c4-ce8d-49f3-a547-712079ff01a5 for instance with vm_state building and task_state spawning. [ 1421.875889] env[67893]: DEBUG nova.compute.manager [req-2ece2d10-2e58-4048-99fd-11ec0625fcdd req-07667920-3bcc-4d3f-a2de-f729a3a38018 service nova] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Received event network-changed-72d045c4-ce8d-49f3-a547-712079ff01a5 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1421.875889] env[67893]: DEBUG nova.compute.manager [req-2ece2d10-2e58-4048-99fd-11ec0625fcdd req-07667920-3bcc-4d3f-a2de-f729a3a38018 service nova] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Refreshing instance network info cache due to event network-changed-72d045c4-ce8d-49f3-a547-712079ff01a5.
{{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1421.876026] env[67893]: DEBUG oslo_concurrency.lockutils [req-2ece2d10-2e58-4048-99fd-11ec0625fcdd req-07667920-3bcc-4d3f-a2de-f729a3a38018 service nova] Acquiring lock "refresh_cache-25d67f98-c132-434b-9d22-4569585527eb" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1421.952664] env[67893]: DEBUG nova.network.neutron [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Updating instance_info_cache with network_info: [{"id": "72d045c4-ce8d-49f3-a547-712079ff01a5", "address": "fa:16:3e:3c:fc:0d", "network": {"id": "2e27f016-1dcd-4d7f-bcac-afbee70b1806", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-130882583-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ceacadba48b74fc3aeaf5968e3a9a0cd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4fb94adb-cc41-4c16-9830-a3205dbd2bf5", "external-id": "nsx-vlan-transportzone-100", "segmentation_id": 100, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap72d045c4-ce", "ovs_interfaceid": "72d045c4-ce8d-49f3-a547-712079ff01a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1421.965012] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Releasing lock "refresh_cache-25d67f98-c132-434b-9d22-4569585527eb" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1421.965322] env[67893]: DEBUG nova.compute.manager [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Instance network_info: |[{"id": "72d045c4-ce8d-49f3-a547-712079ff01a5", "address": "fa:16:3e:3c:fc:0d", "network": {"id": "2e27f016-1dcd-4d7f-bcac-afbee70b1806", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-130882583-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ceacadba48b74fc3aeaf5968e3a9a0cd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4fb94adb-cc41-4c16-9830-a3205dbd2bf5", "external-id": "nsx-vlan-transportzone-100", "segmentation_id": 
100, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap72d045c4-ce", "ovs_interfaceid": "72d045c4-ce8d-49f3-a547-712079ff01a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1421.965614] env[67893]: DEBUG oslo_concurrency.lockutils [req-2ece2d10-2e58-4048-99fd-11ec0625fcdd req-07667920-3bcc-4d3f-a2de-f729a3a38018 service nova] Acquired lock "refresh_cache-25d67f98-c132-434b-9d22-4569585527eb" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1421.965793] env[67893]: DEBUG nova.network.neutron [req-2ece2d10-2e58-4048-99fd-11ec0625fcdd req-07667920-3bcc-4d3f-a2de-f729a3a38018 service nova] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Refreshing network info cache for port 72d045c4-ce8d-49f3-a547-712079ff01a5 {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1421.966843] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:3c:fc:0d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '4fb94adb-cc41-4c16-9830-a3205dbd2bf5', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '72d045c4-ce8d-49f3-a547-712079ff01a5', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1421.974476] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Creating folder: Project (ceacadba48b74fc3aeaf5968e3a9a0cd). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1421.975372] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9ad5feea-c5e6-4bf3-8299-44ba92df362f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1421.987425] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Created folder: Project (ceacadba48b74fc3aeaf5968e3a9a0cd) in parent group-v689771. [ 1421.987631] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Creating folder: Instances. Parent ref: group-v689854. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1421.988100] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-19d24d37-0912-4e34-bb0f-8813700f7d16 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1421.996321] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Created folder: Instances in parent group-v689854. 
[ 1421.996502] env[67893]: DEBUG oslo.service.loopingcall [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1421.996775] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1421.997029] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a2b09c45-f020-4570-be88-8e38af134ed9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1422.018052] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1422.018052] env[67893]: value = "task-3455435" [ 1422.018052] env[67893]: _type = "Task" [ 1422.018052] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1422.025779] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455435, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1422.337316] env[67893]: DEBUG nova.network.neutron [req-2ece2d10-2e58-4048-99fd-11ec0625fcdd req-07667920-3bcc-4d3f-a2de-f729a3a38018 service nova] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Updated VIF entry in instance network info cache for port 72d045c4-ce8d-49f3-a547-712079ff01a5. {{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1422.337696] env[67893]: DEBUG nova.network.neutron [req-2ece2d10-2e58-4048-99fd-11ec0625fcdd req-07667920-3bcc-4d3f-a2de-f729a3a38018 service nova] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Updating instance_info_cache with network_info: [{"id": "72d045c4-ce8d-49f3-a547-712079ff01a5", "address": "fa:16:3e:3c:fc:0d", "network": {"id": "2e27f016-1dcd-4d7f-bcac-afbee70b1806", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-130882583-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ceacadba48b74fc3aeaf5968e3a9a0cd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4fb94adb-cc41-4c16-9830-a3205dbd2bf5", "external-id": "nsx-vlan-transportzone-100", "segmentation_id": 100, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap72d045c4-ce", "ovs_interfaceid": "72d045c4-ce8d-49f3-a547-712079ff01a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1422.347360] env[67893]: DEBUG oslo_concurrency.lockutils [req-2ece2d10-2e58-4048-99fd-11ec0625fcdd req-07667920-3bcc-4d3f-a2de-f729a3a38018 service nova] Releasing lock "refresh_cache-25d67f98-c132-434b-9d22-4569585527eb" {{(pid=67893) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1422.528496] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455435, 'name': CreateVM_Task, 'duration_secs': 0.302195} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1422.528725] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1422.529430] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1422.529594] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1422.529917] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1422.530232] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7be1e4fc-6a1f-4826-9ae5-8498ad9fae8d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1422.536017] env[67893]: DEBUG oslo_vmware.api [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Waiting for the task: (returnval){ [ 1422.536017] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52b6f881-f3f1-b2d5-21d7-a3faa71894e5" [ 1422.536017] env[67893]: _type = "Task" [ 1422.536017] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1422.542909] env[67893]: DEBUG oslo_vmware.api [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52b6f881-f3f1-b2d5-21d7-a3faa71894e5, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1422.591307] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1423.045468] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1423.045724] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1423.045944] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1424.859472] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1424.859782] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1424.859782] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1424.882470] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1424.882630] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1424.882764] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Skipping network cache update for instance because it is Building. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1424.882889] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1424.883015] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1424.883147] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1424.883268] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1424.883386] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1424.883502] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1424.883618] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1424.883735] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1426.859037] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1428.854679] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1428.858288] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1428.858473] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1429.859240] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1429.859498] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1429.859622] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1431.854344] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1432.858620] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1432.870637] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1432.870867] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1432.871045] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1432.871206] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1432.872307] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfb06507-aed5-4f7e-84a3-d1af31c9615d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1432.881032] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f8dd8ba-402a-4f4d-abc5-d96bd47a5a0b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1432.895643] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2897a7e2-6474-4093-a8bc-6c5a139d44ef {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1432.900602] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21be46c6-5b70-476a-b120-4c9caf909658 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1432.928807] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180961MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1432.928963] env[67893]: DEBUG 
oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1432.929173] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1433.003259] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2553f3c0-0988-4e11-a138-7e5f71e71f48 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1433.003421] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c05df6c1-e4c9-4276-9981-e80e584d540c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1433.003550] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5a24adaf-bced-4488-9ccb-fc996b2ba154 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1433.003670] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance efdb0a7e-403d-4de5-8c09-72b9c8f9cd79 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1433.003790] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5ede1991-efee-4c34-af5b-ce71f67456ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1433.003905] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance b3d31ca3-9a7a-49d0-955f-1e12808bf11f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1433.004032] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 8dbbc2e6-9993-4bf0-b66b-6e685789221c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1433.004154] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1433.004270] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 021f1a86-6015-4a22-b501-3ec9079edbec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1433.004382] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 25d67f98-c132-434b-9d22-4569585527eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1433.014989] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance bb9f69b8-d92d-4895-8115-0c436fd51367 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1433.025067] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 41b5c5ec-936a-4abe-9db7-38d0d2aa371d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1433.034586] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 81a6ba30-1d0d-4c4b-9aa1-e9af0cd82e0c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1433.043548] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 14a8db1f-7820-4600-87f4-2788eac02c04 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1433.052053] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 676994ff-f4a9-4ea6-8ba7-a4f0ed04e63f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1433.060806] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance cfd26f59-2527-4108-9765-9206ff27f4f3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1433.071235] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 279f2b46-c95e-4c6e-a710-7dbfb9edddb5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1433.080224] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c5c75fd2-96be-49f6-9dcf-f6f2500c751f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1433.089451] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c81f4530-ceb3-4cd6-87b2-143ca3c3e5fb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1433.098799] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 1a903142-d9fc-41a2-b6db-9330ce2506bf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1433.107426] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance e1849daf-3781-42ef-bede-267efbb652c9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1433.107642] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1433.107795] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1433.340095] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-932ce01d-be3e-4b36-8d4a-e0a7368f103c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.348506] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cc2601e-f6d1-4a0e-9f37-2c1fc69e3e61 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.377436] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01b0cb4f-8fdb-4ee5-b822-0dcbc16ff2c8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.384173] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db31c2af-52ce-4d3e-b1de-0228a08e0190 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.396680] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1433.406219] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1433.419943] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1433.420129] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.491s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1433.578568] env[67893]: DEBUG oslo_concurrency.lockutils [None 
req-8e476051-6816-46b8-a83e-a9a07d91f0c5 tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Acquiring lock "021f1a86-6015-4a22-b501-3ec9079edbec" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1437.111588] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "2875b0a3-0213-4908-b86b-ce45a8901553" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1437.111886] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "2875b0a3-0213-4908-b86b-ce45a8901553" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1449.064116] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3670467c-6b56-49cb-bf6f-95380ba549d9 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Acquiring lock "25d67f98-c132-434b-9d22-4569585527eb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1458.466033] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Acquiring lock "dfb92d1c-c2a5-49c1-8526-3743cb385c97" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1458.466033] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Lock "dfb92d1c-c2a5-49c1-8526-3743cb385c97" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1464.856798] env[67893]: WARNING oslo_vmware.rw_handles [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1464.856798] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1464.856798] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1464.856798] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1464.856798] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1464.856798]
env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 1464.856798] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1464.856798] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1464.856798] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1464.856798] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1464.856798] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1464.856798] env[67893]: ERROR oslo_vmware.rw_handles [ 1464.857561] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/e9fb1031-b362-44f5-8229-af8b53acb218/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1464.859530] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1464.859839] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Copying Virtual Disk [datastore1] vmware_temp/e9fb1031-b362-44f5-8229-af8b53acb218/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/e9fb1031-b362-44f5-8229-af8b53acb218/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1464.860167] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4aee0ad9-9586-4092-879e-251768187cee {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1464.867520] env[67893]: DEBUG oslo_vmware.api [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Waiting for the task: (returnval){ [ 1464.867520] env[67893]: value = "task-3455436" [ 1464.867520] env[67893]: _type = "Task" [ 1464.867520] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1464.875199] env[67893]: DEBUG oslo_vmware.api [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Task: {'id': task-3455436, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1465.378332] env[67893]: DEBUG oslo_vmware.exceptions [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1465.378608] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1465.379162] env[67893]: ERROR nova.compute.manager [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1465.379162] env[67893]: Faults: ['InvalidArgument'] [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Traceback (most recent call last): [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] yield resources [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] self.driver.spawn(context, instance, image_meta, [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] self._fetch_image_if_missing(context, vi) [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] image_cache(vi, tmp_image_ds_loc) [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] vm_util.copy_virtual_disk( [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] session._wait_for_task(vmdk_copy_task) [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] return self.wait_for_task(task_ref) [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] return evt.wait() [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] result = hub.switch() [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] return self.greenlet.switch() [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] self.f(*self.args, **self.kw) [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] raise exceptions.translate_fault(task_info.error) [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Faults: ['InvalidArgument'] [ 1465.379162] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] [ 1465.380091] env[67893]: INFO nova.compute.manager [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Terminating instance [ 1465.381054] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1465.381263] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 
tempest-AttachInterfacesTestJSON-1315262687-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1465.381501] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-26bd4d96-c725-4eab-af00-09558e26bdeb {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.385107] env[67893]: DEBUG nova.compute.manager [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1465.385107] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1465.385578] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35e7c079-ada7-4c73-b88b-ec5babe13077 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.392457] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1465.392667] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7c0111f7-84de-4aa8-afc7-10bd4dae5af5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.394833] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1465.395024] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1465.395989] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-27acd481-af05-458d-aaa4-2a2812fc0123 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.400534] env[67893]: DEBUG oslo_vmware.api [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Waiting for the task: (returnval){ [ 1465.400534] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52b60503-e369-bfc6-e4a4-e0e475b3a339" [ 1465.400534] env[67893]: _type = "Task" [ 1465.400534] env[67893]: } to complete. 
{{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1465.408564] env[67893]: DEBUG oslo_vmware.api [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52b60503-e369-bfc6-e4a4-e0e475b3a339, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1465.470248] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1465.470465] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1465.470645] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Deleting the datastore file [datastore1] 2553f3c0-0988-4e11-a138-7e5f71e71f48 {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1465.470906] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f7094ed8-d87c-4cf6-b15a-dce1f1933aa8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.476363] env[67893]: DEBUG oslo_vmware.api [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Waiting for the task: (returnval){ [ 1465.476363] env[67893]: value = "task-3455438" [ 1465.476363] env[67893]: _type = "Task" [ 1465.476363] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1465.484248] env[67893]: DEBUG oslo_vmware.api [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Task: {'id': task-3455438, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1465.910786] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1465.911107] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Creating directory with path [datastore1] vmware_temp/4e9d65c9-f812-4889-9141-7519761096b2/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1465.911269] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-db472eef-d771-476c-a000-22c6a1a6228f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.921826] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Created directory with path [datastore1] vmware_temp/4e9d65c9-f812-4889-9141-7519761096b2/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1465.922011] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Fetch image to [datastore1] vmware_temp/4e9d65c9-f812-4889-9141-7519761096b2/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1465.922194] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/4e9d65c9-f812-4889-9141-7519761096b2/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1465.922860] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0aa4874-12c3-4ae3-accd-f8fa095453d6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.929095] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-686e3dad-cee5-4450-9a86-98cd30770737 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.938206] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6e78b7f-502d-435c-a72a-d78063d8eba5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.967246] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-4a504546-2e9b-4e65-a15c-dddf9c2b1a46 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.972400] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d84e3669-8725-4a5a-8f76-9c348eb145c5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.983997] env[67893]: DEBUG oslo_vmware.api [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Task: {'id': task-3455438, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071444} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1465.984221] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1465.984402] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1465.984573] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1465.984744] env[67893]: INFO nova.compute.manager [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Took 0.60 seconds to destroy the instance on the hypervisor. 
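The repeated "Waiting for the task" / "progress is 0%" / "completed successfully ... duration_secs" records above all come from oslo.vmware's task polling. A minimal sketch of that poll loop, assuming a hypothetical get_task_info() callable in place of the PropertyCollector read the library actually performs:

    import time

    class TaskFailed(Exception):
        """Stand-in for the translated VimFaultException."""

    def wait_for_task(get_task_info, task_id, interval=0.5):
        # get_task_info is hypothetical; it returns an object exposing
        # .state ('running' | 'success' | 'error'), .progress and .error.
        while True:
            info = get_task_info(task_id)
            if info.state == 'success':
                return info
            if info.state == 'error':
                # oslo.vmware maps the fault name to an exception class at
                # this point; "Fault InvalidArgument not matched" above means
                # no specific class existed, so the generic VimFaultException
                # was raised instead.
                raise TaskFailed(info.error)
            # Between polls the log shows "Task: {...} progress is N%."
            time.sleep(interval)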
[ 1465.986888] env[67893]: DEBUG nova.compute.claims [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1465.987074] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1465.987299] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1465.992580] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1466.046104] env[67893]: DEBUG oslo_vmware.rw_handles [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4e9d65c9-f812-4889-9141-7519761096b2/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1466.105483] env[67893]: DEBUG oslo_vmware.rw_handles [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1466.105682] env[67893]: DEBUG oslo_vmware.rw_handles [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4e9d65c9-f812-4889-9141-7519761096b2/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1466.363021] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9b3d5f2-1569-480f-bf19-1ec37dc728a7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1466.370102] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32c68e0e-57b1-43db-b271-f069e419e457 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1467.054833] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8cffe3c-8a76-415b-8e01-4b4bb2171205 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1467.062679] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01fc6d70-62bf-4127-b784-0a36bba24059 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1467.075377] env[67893]: DEBUG nova.compute.provider_tree [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1467.083854] env[67893]: DEBUG nova.scheduler.client.report [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1467.096926] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 1.110s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1467.097564] env[67893]: ERROR nova.compute.manager [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1467.097564] env[67893]: Faults: ['InvalidArgument'] [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Traceback (most recent call last): [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance 
[ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] self.driver.spawn(context, instance, image_meta, [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] self._fetch_image_if_missing(context, vi) [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] image_cache(vi, tmp_image_ds_loc) [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] vm_util.copy_virtual_disk( [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] session._wait_for_task(vmdk_copy_task) [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] return self.wait_for_task(task_ref) [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] return evt.wait() [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] result = hub.switch() [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] return self.greenlet.switch() [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] self.f(*self.args, **self.kw) [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 
2553f3c0-0988-4e11-a138-7e5f71e71f48] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] raise exceptions.translate_fault(task_info.error) [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Faults: ['InvalidArgument'] [ 1467.097564] env[67893]: ERROR nova.compute.manager [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] [ 1467.099063] env[67893]: DEBUG nova.compute.utils [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1467.099683] env[67893]: DEBUG nova.compute.manager [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Build of instance 2553f3c0-0988-4e11-a138-7e5f71e71f48 was re-scheduled: A specified parameter was not correct: fileType [ 1467.099683] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1467.100066] env[67893]: DEBUG nova.compute.manager [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1467.100266] env[67893]: DEBUG nova.compute.manager [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1467.100409] env[67893]: DEBUG nova.compute.manager [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1467.100563] env[67893]: DEBUG nova.network.neutron [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1467.702284] env[67893]: DEBUG nova.network.neutron [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1467.713412] env[67893]: INFO nova.compute.manager [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Took 0.61 seconds to deallocate network for instance. [ 1467.822200] env[67893]: INFO nova.scheduler.client.report [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Deleted allocations for instance 2553f3c0-0988-4e11-a138-7e5f71e71f48 [ 1467.845974] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1042db68-4808-4df7-8cab-00ad57d6366e tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Lock "2553f3c0-0988-4e11-a138-7e5f71e71f48" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 625.979s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1467.847168] env[67893]: DEBUG oslo_concurrency.lockutils [None req-9f5c6c77-f230-4394-887b-bf3f4adfc47a tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Lock "2553f3c0-0988-4e11-a138-7e5f71e71f48" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 429.334s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1467.847449] env[67893]: DEBUG oslo_concurrency.lockutils [None req-9f5c6c77-f230-4394-887b-bf3f4adfc47a tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Acquiring lock "2553f3c0-0988-4e11-a138-7e5f71e71f48-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1467.847699] env[67893]: DEBUG oslo_concurrency.lockutils [None req-9f5c6c77-f230-4394-887b-bf3f4adfc47a tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Lock "2553f3c0-0988-4e11-a138-7e5f71e71f48-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1467.847854] env[67893]: DEBUG oslo_concurrency.lockutils [None req-9f5c6c77-f230-4394-887b-bf3f4adfc47a tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Lock "2553f3c0-0988-4e11-a138-7e5f71e71f48-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1467.849992] env[67893]: INFO nova.compute.manager [None req-9f5c6c77-f230-4394-887b-bf3f4adfc47a tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Terminating instance [ 1467.851715] env[67893]: DEBUG nova.compute.manager [None req-9f5c6c77-f230-4394-887b-bf3f4adfc47a tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1467.852404] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-9f5c6c77-f230-4394-887b-bf3f4adfc47a tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1467.852892] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-72200ef5-d3f5-4346-ba24-0ec938db1e7e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1467.857801] env[67893]: DEBUG nova.compute.manager [None req-37daa011-268e-415c-9465-50b3a80b115e tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: bb9f69b8-d92d-4895-8115-0c436fd51367] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1467.864099] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ca0e9e9-6941-4fe7-a2c4-38bf896b031b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1467.893791] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-9f5c6c77-f230-4394-887b-bf3f4adfc47a tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 2553f3c0-0988-4e11-a138-7e5f71e71f48 could not be found. 
[ 1467.894009] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-9f5c6c77-f230-4394-887b-bf3f4adfc47a tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1467.894192] env[67893]: INFO nova.compute.manager [None req-9f5c6c77-f230-4394-887b-bf3f4adfc47a tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1467.894436] env[67893]: DEBUG oslo.service.loopingcall [None req-9f5c6c77-f230-4394-887b-bf3f4adfc47a tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1467.894826] env[67893]: DEBUG nova.compute.manager [None req-37daa011-268e-415c-9465-50b3a80b115e tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: bb9f69b8-d92d-4895-8115-0c436fd51367] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1467.895693] env[67893]: DEBUG nova.compute.manager [-] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1467.895795] env[67893]: DEBUG nova.network.neutron [-] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1467.915732] env[67893]: DEBUG oslo_concurrency.lockutils [None req-37daa011-268e-415c-9465-50b3a80b115e tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Lock "bb9f69b8-d92d-4895-8115-0c436fd51367" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 211.630s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1467.924059] env[67893]: DEBUG nova.network.neutron [-] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1467.932044] env[67893]: DEBUG nova.compute.manager [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1467.934351] env[67893]: INFO nova.compute.manager [-] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] Took 0.04 seconds to deallocate network for instance. 
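The "Waiting for function ... _deallocate_network_with_retries to return" record is oslo.service's looping-call wrapper. A sketch of the retry shape under that wrapper, with the deallocate call itself stubbed out:

    from oslo_service import loopingcall

    def deallocate_with_retries(deallocate, max_attempts=3):
        state = {'attempts': 0}

        def _try():
            state['attempts'] += 1
            try:
                deallocate()
            except Exception:
                if state['attempts'] >= max_attempts:
                    raise            # propagated to the wait() below
                return               # retried on the next interval
            raise loopingcall.LoopingCallDone(retvalue=True)

        timer = loopingcall.FixedIntervalLoopingCall(_try)
        # start() logs the "Waiting for function ... to return." line and
        # returns an event; wait() yields the LoopingCallDone retvalue.
        return timer.start(interval=1.0).wait()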
[ 1467.982137] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1467.982546] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1467.983899] env[67893]: INFO nova.compute.claims [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1468.018913] env[67893]: DEBUG oslo_concurrency.lockutils [None req-9f5c6c77-f230-4394-887b-bf3f4adfc47a tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Lock "2553f3c0-0988-4e11-a138-7e5f71e71f48" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.172s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1468.019910] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "2553f3c0-0988-4e11-a138-7e5f71e71f48" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 277.210s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1468.020107] env[67893]: INFO nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2553f3c0-0988-4e11-a138-7e5f71e71f48] During sync_power_state the instance has a pending task (deleting). Skip. 
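The claim that succeeds here is checked against the inventory logged for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57; placement treats usable capacity per resource class as (total - reserved) * allocation_ratio. Worked out from the reported numbers:

    # Inventory as reported for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)  # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0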
[ 1468.020281] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "2553f3c0-0988-4e11-a138-7e5f71e71f48" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1468.265208] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48b6413c-1a22-4956-86d5-a68bb50cdb35 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1468.272803] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab1fa2f0-7da4-48e5-adc9-902ea2862524 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1468.301905] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64da390c-4c11-4be5-8bb8-d523ed3af52b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1468.308541] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec000058-6845-4d16-ac1c-eee8ab1fa672 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1468.323045] env[67893]: DEBUG nova.compute.provider_tree [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1468.331423] env[67893]: DEBUG nova.scheduler.client.report [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1468.349582] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.367s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1468.350089] env[67893]: DEBUG nova.compute.manager [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Start building networks asynchronously for instance. 
{{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1468.381810] env[67893]: DEBUG nova.compute.utils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1468.383408] env[67893]: DEBUG nova.compute.manager [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1468.383585] env[67893]: DEBUG nova.network.neutron [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1468.391912] env[67893]: DEBUG nova.compute.manager [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1468.452644] env[67893]: DEBUG nova.compute.manager [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1468.473929] env[67893]: DEBUG nova.policy [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '21e3dd090f084b6b9e40a2f44b57595a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1abdf94a18714c9381efb2df4e69cd60', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 1468.481090] env[67893]: DEBUG nova.virt.hardware [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1468.481090] env[67893]: DEBUG nova.virt.hardware [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1468.481090] env[67893]: DEBUG nova.virt.hardware [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1468.481090] env[67893]: DEBUG nova.virt.hardware [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1468.481090] env[67893]: DEBUG nova.virt.hardware [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1468.481090] env[67893]: DEBUG nova.virt.hardware [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:430}} [ 1468.481090] env[67893]: DEBUG nova.virt.hardware [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1468.481090] env[67893]: DEBUG nova.virt.hardware [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1468.481090] env[67893]: DEBUG nova.virt.hardware [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1468.481090] env[67893]: DEBUG nova.virt.hardware [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1468.481090] env[67893]: DEBUG nova.virt.hardware [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1468.481531] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20978133-58e9-4903-9e26-e06006fc6aa7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1468.489715] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2a9f568-39bc-420f-ac0e-34101e70033d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1468.877980] env[67893]: DEBUG nova.network.neutron [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Successfully created port: 12b53d86-d190-41b1-a8c1-f10eaca72492 {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1469.455586] env[67893]: DEBUG nova.network.neutron [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Successfully updated port: 12b53d86-d190-41b1-a8c1-f10eaca72492 {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1469.466724] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Acquiring lock "refresh_cache-41b5c5ec-936a-4abe-9db7-38d0d2aa371d" {{(pid=67893) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1469.466860] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Acquired lock "refresh_cache-41b5c5ec-936a-4abe-9db7-38d0d2aa371d" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1469.467025] env[67893]: DEBUG nova.network.neutron [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1469.505879] env[67893]: DEBUG nova.network.neutron [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1469.718991] env[67893]: DEBUG nova.network.neutron [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Updating instance_info_cache with network_info: [{"id": "12b53d86-d190-41b1-a8c1-f10eaca72492", "address": "fa:16:3e:8f:6d:34", "network": {"id": "beff743e-3906-484f-81d5-7241b465aa7c", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-814987567-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1abdf94a18714c9381efb2df4e69cd60", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0f096917-a0cf-4add-a9d2-23ca1c723b3b", "external-id": "nsx-vlan-transportzone-894", "segmentation_id": 894, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap12b53d86-d1", "ovs_interfaceid": "12b53d86-d190-41b1-a8c1-f10eaca72492", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1469.733457] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Releasing lock "refresh_cache-41b5c5ec-936a-4abe-9db7-38d0d2aa371d" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1469.733768] env[67893]: DEBUG nova.compute.manager [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Instance network_info: |[{"id": "12b53d86-d190-41b1-a8c1-f10eaca72492", "address": "fa:16:3e:8f:6d:34", "network": {"id": "beff743e-3906-484f-81d5-7241b465aa7c", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-814987567-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1abdf94a18714c9381efb2df4e69cd60", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0f096917-a0cf-4add-a9d2-23ca1c723b3b", "external-id": "nsx-vlan-transportzone-894", "segmentation_id": 894, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap12b53d86-d1", "ovs_interfaceid": "12b53d86-d190-41b1-a8c1-f10eaca72492", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1469.734182] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:8f:6d:34', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0f096917-a0cf-4add-a9d2-23ca1c723b3b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '12b53d86-d190-41b1-a8c1-f10eaca72492', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1469.741708] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Creating folder: Project (1abdf94a18714c9381efb2df4e69cd60). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1469.742263] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c1624b5d-1c85-4861-abff-c5ae7a0cb922 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1469.752774] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Created folder: Project (1abdf94a18714c9381efb2df4e69cd60) in parent group-v689771.
[ 1469.752968] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Creating folder: Instances. Parent ref: group-v689857. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1469.753190] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6f9e3e29-f354-4be5-895d-dc27050d4a9b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1469.762559] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Created folder: Instances in parent group-v689857.
[ 1469.762829] env[67893]: DEBUG oslo.service.loopingcall [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1469.763067] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1469.763304] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-819eaf1b-decc-4a87-8abf-078883f2c9c7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1469.783391] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1469.783391] env[67893]: value = "task-3455441"
[ 1469.783391] env[67893]: _type = "Task"
[ 1469.783391] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1469.790498] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455441, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1469.827629] env[67893]: DEBUG nova.compute.manager [req-074558e9-b6c2-43bf-b067-80a301d74f5f req-6107f260-20ba-4995-8e97-9745efadd81d service nova] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Received event network-vif-plugged-12b53d86-d190-41b1-a8c1-f10eaca72492 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1469.827805] env[67893]: DEBUG oslo_concurrency.lockutils [req-074558e9-b6c2-43bf-b067-80a301d74f5f req-6107f260-20ba-4995-8e97-9745efadd81d service nova] Acquiring lock "41b5c5ec-936a-4abe-9db7-38d0d2aa371d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1469.828035] env[67893]: DEBUG oslo_concurrency.lockutils [req-074558e9-b6c2-43bf-b067-80a301d74f5f req-6107f260-20ba-4995-8e97-9745efadd81d service nova] Lock "41b5c5ec-936a-4abe-9db7-38d0d2aa371d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1469.828356] env[67893]: DEBUG oslo_concurrency.lockutils [req-074558e9-b6c2-43bf-b067-80a301d74f5f req-6107f260-20ba-4995-8e97-9745efadd81d service nova] Lock "41b5c5ec-936a-4abe-9db7-38d0d2aa371d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1469.828547] env[67893]: DEBUG nova.compute.manager [req-074558e9-b6c2-43bf-b067-80a301d74f5f req-6107f260-20ba-4995-8e97-9745efadd81d service nova] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] No waiting events found dispatching network-vif-plugged-12b53d86-d190-41b1-a8c1-f10eaca72492 {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1469.828710] env[67893]: WARNING nova.compute.manager [req-074558e9-b6c2-43bf-b067-80a301d74f5f req-6107f260-20ba-4995-8e97-9745efadd81d service nova] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Received unexpected event network-vif-plugged-12b53d86-d190-41b1-a8c1-f10eaca72492 for instance with vm_state building and task_state spawning.
[ 1469.828870] env[67893]: DEBUG nova.compute.manager [req-074558e9-b6c2-43bf-b067-80a301d74f5f req-6107f260-20ba-4995-8e97-9745efadd81d service nova] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Received event network-changed-12b53d86-d190-41b1-a8c1-f10eaca72492 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1469.829038] env[67893]: DEBUG nova.compute.manager [req-074558e9-b6c2-43bf-b067-80a301d74f5f req-6107f260-20ba-4995-8e97-9745efadd81d service nova] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Refreshing instance network info cache due to event network-changed-12b53d86-d190-41b1-a8c1-f10eaca72492. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1469.829227] env[67893]: DEBUG oslo_concurrency.lockutils [req-074558e9-b6c2-43bf-b067-80a301d74f5f req-6107f260-20ba-4995-8e97-9745efadd81d service nova] Acquiring lock "refresh_cache-41b5c5ec-936a-4abe-9db7-38d0d2aa371d" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1469.829363] env[67893]: DEBUG oslo_concurrency.lockutils [req-074558e9-b6c2-43bf-b067-80a301d74f5f req-6107f260-20ba-4995-8e97-9745efadd81d service nova] Acquired lock "refresh_cache-41b5c5ec-936a-4abe-9db7-38d0d2aa371d" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1469.829541] env[67893]: DEBUG nova.network.neutron [req-074558e9-b6c2-43bf-b067-80a301d74f5f req-6107f260-20ba-4995-8e97-9745efadd81d service nova] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Refreshing network info cache for port 12b53d86-d190-41b1-a8c1-f10eaca72492 {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1470.131765] env[67893]: DEBUG nova.network.neutron [req-074558e9-b6c2-43bf-b067-80a301d74f5f req-6107f260-20ba-4995-8e97-9745efadd81d service nova] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Updated VIF entry in instance network info cache for port 12b53d86-d190-41b1-a8c1-f10eaca72492. {{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1470.132137] env[67893]: DEBUG nova.network.neutron [req-074558e9-b6c2-43bf-b067-80a301d74f5f req-6107f260-20ba-4995-8e97-9745efadd81d service nova] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Updating instance_info_cache with network_info: [{"id": "12b53d86-d190-41b1-a8c1-f10eaca72492", "address": "fa:16:3e:8f:6d:34", "network": {"id": "beff743e-3906-484f-81d5-7241b465aa7c", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-814987567-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1abdf94a18714c9381efb2df4e69cd60", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0f096917-a0cf-4add-a9d2-23ca1c723b3b", "external-id": "nsx-vlan-transportzone-894", "segmentation_id": 894, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap12b53d86-d1", "ovs_interfaceid": "12b53d86-d190-41b1-a8c1-f10eaca72492", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1470.141694] env[67893]: DEBUG oslo_concurrency.lockutils [req-074558e9-b6c2-43bf-b067-80a301d74f5f req-6107f260-20ba-4995-8e97-9745efadd81d service nova] Releasing lock "refresh_cache-41b5c5ec-936a-4abe-9db7-38d0d2aa371d" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1470.293601] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455441, 'name': CreateVM_Task} progress is 25%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1470.794179] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455441, 'name': CreateVM_Task} progress is 25%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1471.295062] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455441, 'name': CreateVM_Task, 'duration_secs': 1.231} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1471.295238] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1471.295882] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1471.296057] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1471.296372] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1471.296612] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a8fcbc31-d75e-409b-8bc9-f6d8f2c9ecb9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1471.300929] env[67893]: DEBUG oslo_vmware.api [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Waiting for the task: (returnval){
[ 1471.300929] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52e921c7-1cb8-d476-2cfe-0bd4203b9d29"
[ 1471.300929] env[67893]: _type = "Task"
[ 1471.300929] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1471.309038] env[67893]: DEBUG oslo_vmware.api [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52e921c7-1cb8-d476-2cfe-0bd4203b9d29, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1471.811435] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1471.811704] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1471.811866] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1472.530745] env[67893]: DEBUG oslo_concurrency.lockutils [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Acquiring lock "ad60df35-54c0-459e-8a25-981922ae0a88" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1472.530980] env[67893]: DEBUG oslo_concurrency.lockutils [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Lock "ad60df35-54c0-459e-8a25-981922ae0a88" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1478.859569] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1478.859896] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Cleaning up deleted instances with incomplete migration {{(pid=67893) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}}
[ 1481.867989] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1483.961470] env[67893]: DEBUG oslo_concurrency.lockutils [None req-339d37c8-dfe3-418f-874d-9642ef9b5005 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Acquiring lock "41b5c5ec-936a-4abe-9db7-38d0d2aa371d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1486.859391] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1486.859718] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 1486.859718] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 1486.882245] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1486.882437] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1486.882535] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1486.882662] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1486.882785] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1486.882905] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1486.883034] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1486.883157] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1486.883275] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1486.883392] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1486.883508] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 1486.884009] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1487.859553] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1487.859904] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Cleaning up deleted instances {{(pid=67893) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}}
[ 1487.868739] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] There are 0 instances to clean {{(pid=67893) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}}
[ 1488.864206] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1488.864540] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1489.859653] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1489.859897] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1489.860062] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 1490.637610] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a0e599db-ae9e-41ec-9dd4-8ad767b54843 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquiring lock "b2c40a66-699c-4185-8ffa-85dbfc4463c5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1490.637917] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a0e599db-ae9e-41ec-9dd4-8ad767b54843 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock "b2c40a66-699c-4185-8ffa-85dbfc4463c5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1490.859528] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1493.858921] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1493.870546] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1493.870743] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1493.870905] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1493.871073] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 1493.872151] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-add87abd-4902-4b59-b29f-552b29d17c70 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1493.880817] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3d2e08b-4675-439d-8d1c-cef99aba0e57 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1493.894205] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8273d71-7ffc-4f5d-8bcc-05cc6aab2367 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1493.900246] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-664d163f-7f58-4d7b-9731-b56a72d2ecde {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1493.928271] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180936MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 1493.928410] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1493.928578] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1494.097771] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c05df6c1-e4c9-4276-9981-e80e584d540c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1494.097979] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5a24adaf-bced-4488-9ccb-fc996b2ba154 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1494.098132] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance efdb0a7e-403d-4de5-8c09-72b9c8f9cd79 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1494.098263] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5ede1991-efee-4c34-af5b-ce71f67456ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1494.098385] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance b3d31ca3-9a7a-49d0-955f-1e12808bf11f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1494.098507] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 8dbbc2e6-9993-4bf0-b66b-6e685789221c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1494.098623] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1494.098737] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 021f1a86-6015-4a22-b501-3ec9079edbec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1494.098850] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 25d67f98-c132-434b-9d22-4569585527eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1494.098959] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 41b5c5ec-936a-4abe-9db7-38d0d2aa371d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1494.111741] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c5c75fd2-96be-49f6-9dcf-f6f2500c751f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1494.122327] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance c81f4530-ceb3-4cd6-87b2-143ca3c3e5fb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1494.133544] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 1a903142-d9fc-41a2-b6db-9330ce2506bf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1494.143669] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance e1849daf-3781-42ef-bede-267efbb652c9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1494.152841] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2875b0a3-0213-4908-b86b-ce45a8901553 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1494.162491] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance dfb92d1c-c2a5-49c1-8526-3743cb385c97 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1494.171441] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance ad60df35-54c0-459e-8a25-981922ae0a88 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1494.181954] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance b2c40a66-699c-4185-8ffa-85dbfc4463c5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1494.181954] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 1494.181954] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 1494.197207] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Refreshing inventories for resource provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}}
[ 1494.210916] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Updating ProviderTree inventory for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}}
[ 1494.211111] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Updating inventory in ProviderTree for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}}
[ 1494.221715] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Refreshing aggregate associations for resource provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57, aggregates: None {{(pid=67893) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}}
[ 1494.238740] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Refreshing trait associations for resource provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO {{(pid=67893) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}}
[ 1494.431308] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56750989-cf80-468e-bbfd-07388715ecab {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1494.439050] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0979424-fc28-4a47-9ef0-1714fcebff3d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1494.468431] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab6edc11-9bb3-44d8-acbd-4a69fa534558 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1494.475515] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-123f12d5-bc3f-4207-9bf6-589bad629329 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1494.488304] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1494.496513] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1494.511483] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 1494.511483] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.582s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1494.511483] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1508.610829] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fb4f8be8-000d-4eeb-b7f2-de178a5536ea tempest-ServersNegativeTestMultiTenantJSON-1847586781 tempest-ServersNegativeTestMultiTenantJSON-1847586781-project-member] Acquiring lock "1c6116c0-84c2-40bd-84d2-bf1f4a5b9a10" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1508.610829] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fb4f8be8-000d-4eeb-b7f2-de178a5536ea tempest-ServersNegativeTestMultiTenantJSON-1847586781 tempest-ServersNegativeTestMultiTenantJSON-1847586781-project-member] Lock "1c6116c0-84c2-40bd-84d2-bf1f4a5b9a10" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1514.877031] env[67893]: WARNING oslo_vmware.rw_handles [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1514.877031] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1514.877031] env[67893]: ERROR oslo_vmware.rw_handles   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1514.877031] env[67893]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 1514.877031] env[67893]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1514.877031] env[67893]: ERROR oslo_vmware.rw_handles     response.begin()
[ 1514.877031] env[67893]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1514.877031] env[67893]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 1514.877031] env[67893]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1514.877031] env[67893]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 1514.877031] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1514.877031] env[67893]: ERROR oslo_vmware.rw_handles
[ 1514.877031] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/4e9d65c9-f812-4889-9141-7519761096b2/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1514.879285] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1514.879532] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Copying Virtual Disk [datastore1] vmware_temp/4e9d65c9-f812-4889-9141-7519761096b2/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/4e9d65c9-f812-4889-9141-7519761096b2/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1514.879811] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-84c89b58-1312-464d-af8f-61e875667547 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1514.888392] env[67893]: DEBUG oslo_vmware.api [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Waiting for the task: (returnval){
[ 1514.888392] env[67893]: value = "task-3455442"
[ 1514.888392] env[67893]: _type = "Task"
[ 1514.888392] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1514.896983] env[67893]: DEBUG oslo_vmware.api [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Task: {'id': task-3455442, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1515.400100] env[67893]: DEBUG oslo_vmware.exceptions [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 1515.400100] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1515.400291] env[67893]: ERROR nova.compute.manager [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1515.400291] env[67893]: Faults: ['InvalidArgument']
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Traceback (most recent call last):
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]   File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]     yield resources
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]     self.driver.spawn(context, instance, image_meta,
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]     self._fetch_image_if_missing(context, vi)
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]     image_cache(vi, tmp_image_ds_loc)
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]     vm_util.copy_virtual_disk(
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]     session._wait_for_task(vmdk_copy_task)
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]     return self.wait_for_task(task_ref)
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]     return evt.wait()
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]     result = hub.switch()
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]     return self.greenlet.switch()
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]     self.f(*self.args, **self.kw)
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]     raise exceptions.translate_fault(task_info.error)
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Faults: ['InvalidArgument']
[ 1515.400291] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c]
[ 1515.401315] env[67893]: INFO nova.compute.manager [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Terminating instance
[ 1515.402045] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1515.402257] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1515.402864] env[67893]: DEBUG nova.compute.manager [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1515.403062] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1515.403286] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8b8d1431-2d3f-4976-ac2d-5ff36089655e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1515.405718] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c187e1e5-095c-4ec5-84ff-5796af974e12 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1515.412682] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1515.412887] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-77e74d18-4fde-47cc-b003-42b25604a204 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1515.414926] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1515.415112] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1515.416091] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-77ca5e35-7d63-4df3-a7ca-aa70d8895d90 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1515.421122] env[67893]: DEBUG oslo_vmware.api [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Waiting for the task: (returnval){
[ 1515.421122] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]5267934b-0b2b-56aa-2334-5337c597bb5f"
[ 1515.421122] env[67893]: _type = "Task"
[ 1515.421122] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1515.435176] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1515.435399] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Creating directory with path [datastore1] vmware_temp/9bc9d6ef-811f-434b-b75d-35439e66de95/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1515.435609] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7db5d857-6bc4-48ea-8d2c-7b56e68d2e12 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1515.455621] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Created directory with path [datastore1] vmware_temp/9bc9d6ef-811f-434b-b75d-35439e66de95/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1515.455831] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Fetch image to [datastore1] vmware_temp/9bc9d6ef-811f-434b-b75d-35439e66de95/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1515.456010] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/9bc9d6ef-811f-434b-b75d-35439e66de95/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1515.456768] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27aba39f-7679-4748-8a5e-35d5c252533c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1515.463940] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79ab1c5d-a2a6-48ad-b40a-b58747f3f569 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1515.473089] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc4968f6-d86f-4d0c-9521-b74cef2401fd {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1515.478485] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1515.478699] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1515.478874] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Deleting the datastore file [datastore1] c05df6c1-e4c9-4276-9981-e80e584d540c {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1515.479110] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f8f9a1a1-e3a3-441d-b490-f9af25116add {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1515.506974] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-931608f3-63af-4c0b-b86f-a0af55ec2e9d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1515.510902] env[67893]: DEBUG oslo_vmware.api [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Waiting for the task: (returnval){
[ 1515.510902] env[67893]: value = "task-3455444"
[ 1515.510902] env[67893]: _type = "Task"
[ 1515.510902] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1515.515288] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8c54b585-b67f-42ce-b002-d74af6d11ca3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1515.519628] env[67893]: DEBUG oslo_vmware.api [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Task: {'id': task-3455444, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1515.537456] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1515.678458] env[67893]: DEBUG oslo_vmware.rw_handles [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9bc9d6ef-811f-434b-b75d-35439e66de95/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1515.738907] env[67893]: DEBUG oslo_vmware.rw_handles [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1515.739115] env[67893]: DEBUG oslo_vmware.rw_handles [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9bc9d6ef-811f-434b-b75d-35439e66de95/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1516.020850] env[67893]: DEBUG oslo_vmware.api [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Task: {'id': task-3455444, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073633} completed successfully.
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1516.022047] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1516.022047] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1516.022047] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1516.022047] env[67893]: INFO nova.compute.manager [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Took 0.62 seconds to destroy the instance on the hypervisor. [ 1516.024079] env[67893]: DEBUG nova.compute.claims [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1516.024255] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1516.024464] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1516.272441] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9523ed0a-bce5-4984-8380-d17cfa61882d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1516.279958] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-887eeb5a-0e24-4c3c-b8f1-e354a6cfcd28 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1516.308375] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-096738e5-5237-4ec0-a1e0-3343e7bf5324 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1516.315446] env[67893]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb226a5f-9ab0-4d31-a1d1-54f92adf1f19 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1516.329054] env[67893]: DEBUG nova.compute.provider_tree [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1516.337053] env[67893]: DEBUG nova.scheduler.client.report [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1516.351396] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.327s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1516.351928] env[67893]: ERROR nova.compute.manager [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1516.351928] env[67893]: Faults: ['InvalidArgument'] [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Traceback (most recent call last): [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] self.driver.spawn(context, instance, image_meta, [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] self._fetch_image_if_missing(context, vi) [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in 
_fetch_image_if_missing [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] image_cache(vi, tmp_image_ds_loc) [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] vm_util.copy_virtual_disk( [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] session._wait_for_task(vmdk_copy_task) [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] return self.wait_for_task(task_ref) [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] return evt.wait() [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] result = hub.switch() [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] return self.greenlet.switch() [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] self.f(*self.args, **self.kw) [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] raise exceptions.translate_fault(task_info.error) [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Faults: ['InvalidArgument'] [ 1516.351928] env[67893]: ERROR nova.compute.manager [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] [ 1516.352914] env[67893]: DEBUG nova.compute.utils [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] VimFaultException {{(pid=67893) 
notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1516.354056] env[67893]: DEBUG nova.compute.manager [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Build of instance c05df6c1-e4c9-4276-9981-e80e584d540c was re-scheduled: A specified parameter was not correct: fileType [ 1516.354056] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1516.354426] env[67893]: DEBUG nova.compute.manager [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1516.354598] env[67893]: DEBUG nova.compute.manager [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1516.354765] env[67893]: DEBUG nova.compute.manager [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1516.354923] env[67893]: DEBUG nova.network.neutron [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1516.668812] env[67893]: DEBUG nova.network.neutron [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1516.684749] env[67893]: INFO nova.compute.manager [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Took 0.33 seconds to deallocate network for instance. 
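The traceback above is the pivotal event in this section: nova.virt.vmwareapi.vm_util.copy_virtual_disk invokes CopyVirtualDisk_Task and hands the resulting task to the session's wait_for_task(), whose polling loop translates the task's error state into oslo_vmware.exceptions.VimFaultException (here InvalidArgument on the fileType parameter). That exception is what propagates up through _fetch_image_if_missing and spawn(), causing the claim abort, network deallocation, and re-schedule recorded above. A minimal sketch of that call path, assuming an already-created oslo.vmware session; the function name copy_disk and its arguments are illustrative, not Nova's exact code:

    # Simplified sketch of how a VIM task fault surfaces as a
    # VimFaultException, modeled on the polling path shown in the
    # traceback above; `copy_disk` and its parameters are illustrative.
    from oslo_vmware import exceptions as vexc

    def copy_disk(session, vim, dc_ref, source, dest):
        # CopyVirtualDisk_Task is invoked through the VirtualDiskManager;
        # the returned task object is then polled by wait_for_task()
        # until its TaskInfo reaches the 'success' or 'error' state.
        task = session.invoke_api(
            vim, 'CopyVirtualDisk_Task',
            vim.service_content.virtualDiskManager,
            sourceName=source, sourceDatacenter=dc_ref,
            destName=dest, destDatacenter=dc_ref)
        try:
            # On an 'error' task state, oslo.vmware translates the fault
            # (here InvalidArgument on 'fileType') into VimFaultException,
            # which is what escapes from vm_util.copy_virtual_disk above.
            session.wait_for_task(task)
        except vexc.VimFaultException as e:
            # Nova's compute manager catches build failures like this,
            # aborts the resource claim, and re-schedules the instance.
            print('disk copy failed: %s' % e)
            raise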
[ 1516.782906] env[67893]: INFO nova.scheduler.client.report [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Deleted allocations for instance c05df6c1-e4c9-4276-9981-e80e584d540c [ 1516.813026] env[67893]: DEBUG oslo_concurrency.lockutils [None req-5cdefab7-a1c7-45d2-bc68-2424815ea57c tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Lock "c05df6c1-e4c9-4276-9981-e80e584d540c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 655.447s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1516.814091] env[67893]: DEBUG oslo_concurrency.lockutils [None req-9b391c00-097d-4240-bc00-1283fadbfb79 tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Lock "c05df6c1-e4c9-4276-9981-e80e584d540c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 459.303s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1516.814311] env[67893]: DEBUG oslo_concurrency.lockutils [None req-9b391c00-097d-4240-bc00-1283fadbfb79 tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Acquiring lock "c05df6c1-e4c9-4276-9981-e80e584d540c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1516.814503] env[67893]: DEBUG oslo_concurrency.lockutils [None req-9b391c00-097d-4240-bc00-1283fadbfb79 tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Lock "c05df6c1-e4c9-4276-9981-e80e584d540c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1516.814660] env[67893]: DEBUG oslo_concurrency.lockutils [None req-9b391c00-097d-4240-bc00-1283fadbfb79 tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Lock "c05df6c1-e4c9-4276-9981-e80e584d540c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1516.816743] env[67893]: INFO nova.compute.manager [None req-9b391c00-097d-4240-bc00-1283fadbfb79 tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Terminating instance [ 1516.818490] env[67893]: DEBUG nova.compute.manager [None req-9b391c00-097d-4240-bc00-1283fadbfb79 tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Start destroying the instance on the hypervisor. 
{{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1516.818731] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-9b391c00-097d-4240-bc00-1283fadbfb79 tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1516.819455] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f134222c-14e7-4e72-ac6f-5de7ffaafcfd {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1516.828121] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c97a21ea-5ded-4905-b9f9-0f4a9ebd0a7c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1516.839033] env[67893]: DEBUG nova.compute.manager [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 81a6ba30-1d0d-4c4b-9aa1-e9af0cd82e0c] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1516.859359] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-9b391c00-097d-4240-bc00-1283fadbfb79 tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c05df6c1-e4c9-4276-9981-e80e584d540c could not be found. [ 1516.859557] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-9b391c00-097d-4240-bc00-1283fadbfb79 tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1516.859728] env[67893]: INFO nova.compute.manager [None req-9b391c00-097d-4240-bc00-1283fadbfb79 tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1516.859967] env[67893]: DEBUG oslo.service.loopingcall [None req-9b391c00-097d-4240-bc00-1283fadbfb79 tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1516.860234] env[67893]: DEBUG nova.compute.manager [-] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1516.860335] env[67893]: DEBUG nova.network.neutron [-] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1516.864541] env[67893]: DEBUG nova.compute.manager [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 81a6ba30-1d0d-4c4b-9aa1-e9af0cd82e0c] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1516.884741] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Lock "81a6ba30-1d0d-4c4b-9aa1-e9af0cd82e0c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 228.722s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1516.887853] env[67893]: DEBUG nova.network.neutron [-] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1516.893261] env[67893]: DEBUG nova.compute.manager [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 14a8db1f-7820-4600-87f4-2788eac02c04] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1516.895829] env[67893]: INFO nova.compute.manager [-] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] Took 0.04 seconds to deallocate network for instance. [ 1516.918011] env[67893]: DEBUG nova.compute.manager [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 14a8db1f-7820-4600-87f4-2788eac02c04] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1516.939998] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Lock "14a8db1f-7820-4600-87f4-2788eac02c04" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 228.751s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1516.951756] env[67893]: DEBUG nova.compute.manager [None req-1af0e92d-5776-48c6-a3d7-52cbf6e57208 tempest-MultipleCreateTestJSON-237684718 tempest-MultipleCreateTestJSON-237684718-project-member] [instance: 676994ff-f4a9-4ea6-8ba7-a4f0ed04e63f] Starting instance... 
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1516.978405] env[67893]: DEBUG nova.compute.manager [None req-1af0e92d-5776-48c6-a3d7-52cbf6e57208 tempest-MultipleCreateTestJSON-237684718 tempest-MultipleCreateTestJSON-237684718-project-member] [instance: 676994ff-f4a9-4ea6-8ba7-a4f0ed04e63f] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1516.994551] env[67893]: DEBUG oslo_concurrency.lockutils [None req-9b391c00-097d-4240-bc00-1283fadbfb79 tempest-AttachInterfacesTestJSON-1315262687 tempest-AttachInterfacesTestJSON-1315262687-project-member] Lock "c05df6c1-e4c9-4276-9981-e80e584d540c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.180s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1516.995702] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "c05df6c1-e4c9-4276-9981-e80e584d540c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 326.185s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1516.995895] env[67893]: INFO nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: c05df6c1-e4c9-4276-9981-e80e584d540c] During sync_power_state the instance has a pending task (deleting). Skip. [ 1516.996086] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "c05df6c1-e4c9-4276-9981-e80e584d540c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1517.003392] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1af0e92d-5776-48c6-a3d7-52cbf6e57208 tempest-MultipleCreateTestJSON-237684718 tempest-MultipleCreateTestJSON-237684718-project-member] Lock "676994ff-f4a9-4ea6-8ba7-a4f0ed04e63f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 224.891s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1517.012897] env[67893]: DEBUG nova.compute.manager [None req-1af0e92d-5776-48c6-a3d7-52cbf6e57208 tempest-MultipleCreateTestJSON-237684718 tempest-MultipleCreateTestJSON-237684718-project-member] [instance: cfd26f59-2527-4108-9765-9206ff27f4f3] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1517.037910] env[67893]: DEBUG nova.compute.manager [None req-1af0e92d-5776-48c6-a3d7-52cbf6e57208 tempest-MultipleCreateTestJSON-237684718 tempest-MultipleCreateTestJSON-237684718-project-member] [instance: cfd26f59-2527-4108-9765-9206ff27f4f3] Instance disappeared before build. 
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1517.059242] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1af0e92d-5776-48c6-a3d7-52cbf6e57208 tempest-MultipleCreateTestJSON-237684718 tempest-MultipleCreateTestJSON-237684718-project-member] Lock "cfd26f59-2527-4108-9765-9206ff27f4f3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 224.922s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1517.068898] env[67893]: DEBUG nova.compute.manager [None req-06f0052c-6ff2-43b2-b25c-a14e4c7a5bfa tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 279f2b46-c95e-4c6e-a710-7dbfb9edddb5] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1517.092456] env[67893]: DEBUG nova.compute.manager [None req-06f0052c-6ff2-43b2-b25c-a14e4c7a5bfa tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 279f2b46-c95e-4c6e-a710-7dbfb9edddb5] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1517.112999] env[67893]: DEBUG oslo_concurrency.lockutils [None req-06f0052c-6ff2-43b2-b25c-a14e4c7a5bfa tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock "279f2b46-c95e-4c6e-a710-7dbfb9edddb5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 222.822s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1517.122971] env[67893]: DEBUG nova.compute.manager [None req-1626100c-4f83-4623-aece-b62a80afe28d tempest-ServersNegativeTestJSON-1739007541 tempest-ServersNegativeTestJSON-1739007541-project-member] [instance: c5c75fd2-96be-49f6-9dcf-f6f2500c751f] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1517.146253] env[67893]: DEBUG nova.compute.manager [None req-1626100c-4f83-4623-aece-b62a80afe28d tempest-ServersNegativeTestJSON-1739007541 tempest-ServersNegativeTestJSON-1739007541-project-member] [instance: c5c75fd2-96be-49f6-9dcf-f6f2500c751f] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1517.168417] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1626100c-4f83-4623-aece-b62a80afe28d tempest-ServersNegativeTestJSON-1739007541 tempest-ServersNegativeTestJSON-1739007541-project-member] Lock "c5c75fd2-96be-49f6-9dcf-f6f2500c751f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 216.434s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1517.178206] env[67893]: DEBUG nova.compute.manager [None req-bd7a1dac-4382-4046-ae6d-a22429bad93e tempest-InstanceActionsV221TestJSON-844008283 tempest-InstanceActionsV221TestJSON-844008283-project-member] [instance: c81f4530-ceb3-4cd6-87b2-143ca3c3e5fb] Starting instance... 
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1517.205497] env[67893]: DEBUG nova.compute.manager [None req-bd7a1dac-4382-4046-ae6d-a22429bad93e tempest-InstanceActionsV221TestJSON-844008283 tempest-InstanceActionsV221TestJSON-844008283-project-member] [instance: c81f4530-ceb3-4cd6-87b2-143ca3c3e5fb] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1517.252304] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bd7a1dac-4382-4046-ae6d-a22429bad93e tempest-InstanceActionsV221TestJSON-844008283 tempest-InstanceActionsV221TestJSON-844008283-project-member] Lock "c81f4530-ceb3-4cd6-87b2-143ca3c3e5fb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 213.548s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1517.262217] env[67893]: DEBUG nova.compute.manager [None req-21b16e45-006f-42b0-a4b0-c03dc50846a4 tempest-ServerActionsV293TestJSON-1050756707 tempest-ServerActionsV293TestJSON-1050756707-project-member] [instance: 1a903142-d9fc-41a2-b6db-9330ce2506bf] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1518.302517] env[67893]: DEBUG nova.compute.manager [None req-21b16e45-006f-42b0-a4b0-c03dc50846a4 tempest-ServerActionsV293TestJSON-1050756707 tempest-ServerActionsV293TestJSON-1050756707-project-member] [instance: 1a903142-d9fc-41a2-b6db-9330ce2506bf] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1518.335080] env[67893]: DEBUG oslo_concurrency.lockutils [None req-21b16e45-006f-42b0-a4b0-c03dc50846a4 tempest-ServerActionsV293TestJSON-1050756707 tempest-ServerActionsV293TestJSON-1050756707-project-member] Lock "1a903142-d9fc-41a2-b6db-9330ce2506bf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 197.312s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1518.344536] env[67893]: DEBUG nova.compute.manager [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Starting instance... 
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1518.399154] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1518.399410] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1518.400841] env[67893]: INFO nova.compute.claims [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1518.664064] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-055f0348-61c2-40da-a9a6-5d8108ed9b07 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1518.673021] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0564368c-5222-48d5-93ef-727694daa25d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1518.702772] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ece1d1d-dc3c-4c8c-8eed-e5e855508d9e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1518.709795] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0743cea-b175-46cf-9cb1-9ff919a99a80 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1518.722573] env[67893]: DEBUG nova.compute.provider_tree [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1518.730819] env[67893]: DEBUG nova.scheduler.client.report [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1518.750531] env[67893]: DEBUG 
oslo_concurrency.lockutils [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.351s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1518.751096] env[67893]: DEBUG nova.compute.manager [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1518.786367] env[67893]: DEBUG nova.compute.utils [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1518.788336] env[67893]: DEBUG nova.compute.manager [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1518.788548] env[67893]: DEBUG nova.network.neutron [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1518.798192] env[67893]: DEBUG nova.compute.manager [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1518.856858] env[67893]: DEBUG nova.policy [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9115f73c22bf4b0e9e5439363832061d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7a19d9bde3814325847c06cec1af09b7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 1518.867750] env[67893]: DEBUG nova.compute.manager [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1518.892504] env[67893]: DEBUG nova.virt.hardware [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1518.893239] env[67893]: DEBUG nova.virt.hardware [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1518.894016] env[67893]: DEBUG nova.virt.hardware [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1518.894171] env[67893]: DEBUG nova.virt.hardware [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1518.894461] env[67893]: DEBUG nova.virt.hardware [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1518.898016] env[67893]: DEBUG nova.virt.hardware [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1518.898016] env[67893]: DEBUG nova.virt.hardware [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1518.898016] env[67893]: DEBUG nova.virt.hardware [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1518.898016] 
env[67893]: DEBUG nova.virt.hardware [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1518.898016] env[67893]: DEBUG nova.virt.hardware [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1518.898016] env[67893]: DEBUG nova.virt.hardware [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1518.898016] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c81a398-72d8-4a39-adac-77b82073fd17 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1518.906265] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0517960-9c50-4f30-99a5-3ef345ee570f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1519.236458] env[67893]: DEBUG nova.network.neutron [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Successfully created port: dbfc0f5f-a5cd-4fa8-bc2a-065ff286442c {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1520.156584] env[67893]: DEBUG nova.compute.manager [req-c101d5a0-6cdd-489e-99d0-f1be6e083646 req-9cf8e841-c480-4e28-933a-3840dcd9d468 service nova] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Received event network-vif-plugged-dbfc0f5f-a5cd-4fa8-bc2a-065ff286442c {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1520.156835] env[67893]: DEBUG oslo_concurrency.lockutils [req-c101d5a0-6cdd-489e-99d0-f1be6e083646 req-9cf8e841-c480-4e28-933a-3840dcd9d468 service nova] Acquiring lock "e1849daf-3781-42ef-bede-267efbb652c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1520.157017] env[67893]: DEBUG oslo_concurrency.lockutils [req-c101d5a0-6cdd-489e-99d0-f1be6e083646 req-9cf8e841-c480-4e28-933a-3840dcd9d468 service nova] Lock "e1849daf-3781-42ef-bede-267efbb652c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1520.157196] env[67893]: DEBUG oslo_concurrency.lockutils [req-c101d5a0-6cdd-489e-99d0-f1be6e083646 req-9cf8e841-c480-4e28-933a-3840dcd9d468 service nova] Lock "e1849daf-3781-42ef-bede-267efbb652c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 
1520.157361] env[67893]: DEBUG nova.compute.manager [req-c101d5a0-6cdd-489e-99d0-f1be6e083646 req-9cf8e841-c480-4e28-933a-3840dcd9d468 service nova] [instance: e1849daf-3781-42ef-bede-267efbb652c9] No waiting events found dispatching network-vif-plugged-dbfc0f5f-a5cd-4fa8-bc2a-065ff286442c {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1520.157523] env[67893]: WARNING nova.compute.manager [req-c101d5a0-6cdd-489e-99d0-f1be6e083646 req-9cf8e841-c480-4e28-933a-3840dcd9d468 service nova] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Received unexpected event network-vif-plugged-dbfc0f5f-a5cd-4fa8-bc2a-065ff286442c for instance with vm_state building and task_state spawning. [ 1520.245835] env[67893]: DEBUG nova.network.neutron [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Successfully updated port: dbfc0f5f-a5cd-4fa8-bc2a-065ff286442c {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1520.260162] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "refresh_cache-e1849daf-3781-42ef-bede-267efbb652c9" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1520.260319] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquired lock "refresh_cache-e1849daf-3781-42ef-bede-267efbb652c9" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1520.260466] env[67893]: DEBUG nova.network.neutron [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1520.297309] env[67893]: DEBUG nova.network.neutron [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Instance cache missing network info. 
{{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1520.536532] env[67893]: DEBUG nova.network.neutron [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Updating instance_info_cache with network_info: [{"id": "dbfc0f5f-a5cd-4fa8-bc2a-065ff286442c", "address": "fa:16:3e:7e:48:83", "network": {"id": "b5038471-f3b2-4f1f-b2f9-62effa71f1aa", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1405799721-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7a19d9bde3814325847c06cec1af09b7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ef02af-c508-432f-ae29-3a219701d584", "external-id": "nsx-vlan-transportzone-313", "segmentation_id": 313, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdbfc0f5f-a5", "ovs_interfaceid": "dbfc0f5f-a5cd-4fa8-bc2a-065ff286442c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1520.548930] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Releasing lock "refresh_cache-e1849daf-3781-42ef-bede-267efbb652c9" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1520.549255] env[67893]: DEBUG nova.compute.manager [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Instance network_info: |[{"id": "dbfc0f5f-a5cd-4fa8-bc2a-065ff286442c", "address": "fa:16:3e:7e:48:83", "network": {"id": "b5038471-f3b2-4f1f-b2f9-62effa71f1aa", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1405799721-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7a19d9bde3814325847c06cec1af09b7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ef02af-c508-432f-ae29-3a219701d584", "external-id": "nsx-vlan-transportzone-313", "segmentation_id": 313, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdbfc0f5f-a5", "ovs_interfaceid": "dbfc0f5f-a5cd-4fa8-bc2a-065ff286442c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1520.549651] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7e:48:83', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '89ef02af-c508-432f-ae29-3a219701d584', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'dbfc0f5f-a5cd-4fa8-bc2a-065ff286442c', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1520.557130] env[67893]: DEBUG oslo.service.loopingcall [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1520.557739] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1520.558031] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ce71f364-4f5a-462c-9c11-860177cee6c0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1520.579264] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1520.579264] env[67893]: value = "task-3455445" [ 1520.579264] env[67893]: _type = "Task" [ 1520.579264] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1520.587033] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455445, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1521.089659] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455445, 'name': CreateVM_Task, 'duration_secs': 0.30317} completed successfully. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1521.089821] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1521.090500] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1521.090685] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1521.091107] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1521.091365] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7b7d7811-a586-4907-88ab-6684b9d49be1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1521.095640] env[67893]: DEBUG oslo_vmware.api [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Waiting for the task: (returnval){ [ 1521.095640] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52464fb1-8933-e27b-74d8-60ff1d9f9f74" [ 1521.095640] env[67893]: _type = "Task" [ 1521.095640] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1521.102804] env[67893]: DEBUG oslo_vmware.api [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52464fb1-8933-e27b-74d8-60ff1d9f9f74, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1521.607181] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1521.607476] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1521.607750] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1522.190616] env[67893]: DEBUG nova.compute.manager [req-1455a685-eb7b-44d0-9124-964824d813e4 req-d6fad6ff-97d7-4f16-be66-4dc7f59c3fc6 service nova] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Received event network-changed-dbfc0f5f-a5cd-4fa8-bc2a-065ff286442c {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1522.190917] env[67893]: DEBUG nova.compute.manager [req-1455a685-eb7b-44d0-9124-964824d813e4 req-d6fad6ff-97d7-4f16-be66-4dc7f59c3fc6 service nova] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Refreshing instance network info cache due to event network-changed-dbfc0f5f-a5cd-4fa8-bc2a-065ff286442c. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1522.191176] env[67893]: DEBUG oslo_concurrency.lockutils [req-1455a685-eb7b-44d0-9124-964824d813e4 req-d6fad6ff-97d7-4f16-be66-4dc7f59c3fc6 service nova] Acquiring lock "refresh_cache-e1849daf-3781-42ef-bede-267efbb652c9" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1522.191331] env[67893]: DEBUG oslo_concurrency.lockutils [req-1455a685-eb7b-44d0-9124-964824d813e4 req-d6fad6ff-97d7-4f16-be66-4dc7f59c3fc6 service nova] Acquired lock "refresh_cache-e1849daf-3781-42ef-bede-267efbb652c9" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1522.191492] env[67893]: DEBUG nova.network.neutron [req-1455a685-eb7b-44d0-9124-964824d813e4 req-d6fad6ff-97d7-4f16-be66-4dc7f59c3fc6 service nova] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Refreshing network info cache for port dbfc0f5f-a5cd-4fa8-bc2a-065ff286442c {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1522.435696] env[67893]: DEBUG nova.network.neutron [req-1455a685-eb7b-44d0-9124-964824d813e4 req-d6fad6ff-97d7-4f16-be66-4dc7f59c3fc6 service nova] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Updated VIF entry in instance network info cache for port dbfc0f5f-a5cd-4fa8-bc2a-065ff286442c. 
{{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1522.436066] env[67893]: DEBUG nova.network.neutron [req-1455a685-eb7b-44d0-9124-964824d813e4 req-d6fad6ff-97d7-4f16-be66-4dc7f59c3fc6 service nova] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Updating instance_info_cache with network_info: [{"id": "dbfc0f5f-a5cd-4fa8-bc2a-065ff286442c", "address": "fa:16:3e:7e:48:83", "network": {"id": "b5038471-f3b2-4f1f-b2f9-62effa71f1aa", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1405799721-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7a19d9bde3814325847c06cec1af09b7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ef02af-c508-432f-ae29-3a219701d584", "external-id": "nsx-vlan-transportzone-313", "segmentation_id": 313, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdbfc0f5f-a5", "ovs_interfaceid": "dbfc0f5f-a5cd-4fa8-bc2a-065ff286442c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1522.445351] env[67893]: DEBUG oslo_concurrency.lockutils [req-1455a685-eb7b-44d0-9124-964824d813e4 req-d6fad6ff-97d7-4f16-be66-4dc7f59c3fc6 service nova] Releasing lock "refresh_cache-e1849daf-3781-42ef-bede-267efbb652c9" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1542.518122] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1546.859042] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1546.859042] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1546.859476] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1546.879999] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Skipping network cache update for instance because it is Building. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1546.880160] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1546.880287] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1546.880414] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1546.880539] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1546.880664] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1546.880784] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1546.880909] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1546.881039] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1546.881382] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1546.881582] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1546.882120] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1549.859266] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1549.859637] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1550.854137] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1551.859460] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1551.859795] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1552.859841] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1553.854788] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1554.859198] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1554.871359] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1554.871623] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1554.871733] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" 
"released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1554.871889] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1554.873374] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fabb2c1-8dc1-4f5c-9626-204c3e73d53b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1554.881789] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dae1fa54-b045-4816-a64b-ec092ff4f07b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1554.895797] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-606c03de-e3e7-4883-b4b9-d55b73a7cadb {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1554.902048] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48ee61bf-0ec8-407f-a027-f3d5b1b4d655 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1554.933910] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180966MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1554.934065] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1554.934262] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1555.009254] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5a24adaf-bced-4488-9ccb-fc996b2ba154 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1555.009422] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance efdb0a7e-403d-4de5-8c09-72b9c8f9cd79 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1555.009553] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5ede1991-efee-4c34-af5b-ce71f67456ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1555.009677] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance b3d31ca3-9a7a-49d0-955f-1e12808bf11f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1555.009798] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 8dbbc2e6-9993-4bf0-b66b-6e685789221c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1555.009914] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1555.010053] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 021f1a86-6015-4a22-b501-3ec9079edbec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1555.010187] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 25d67f98-c132-434b-9d22-4569585527eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1555.010331] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 41b5c5ec-936a-4abe-9db7-38d0d2aa371d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1555.010454] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance e1849daf-3781-42ef-bede-267efbb652c9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1555.021388] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2875b0a3-0213-4908-b86b-ce45a8901553 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1555.031709] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance dfb92d1c-c2a5-49c1-8526-3743cb385c97 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1555.041565] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance ad60df35-54c0-459e-8a25-981922ae0a88 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1555.053650] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance b2c40a66-699c-4185-8ffa-85dbfc4463c5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1555.062627] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 1c6116c0-84c2-40bd-84d2-bf1f4a5b9a10 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1555.062844] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1555.062992] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1555.223443] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c8bdb98-e3e0-4862-964c-1e4ce72792ef {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1555.231239] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e034042-3ab9-4821-bf24-6b07fab2926a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1555.259625] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6c88038-fe18-467e-9c7c-eccffe0bb62c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1555.266381] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3235a17-2e0e-42fc-9a63-eb7d267c407b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1555.278991] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1555.288024] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1555.301035] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1555.301218] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.367s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1564.893960] env[67893]: WARNING oslo_vmware.rw_handles [None 
req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1564.893960] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1564.893960] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1564.893960] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1564.893960] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1564.893960] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 1564.893960] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1564.893960] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1564.893960] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1564.893960] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1564.893960] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1564.893960] env[67893]: ERROR oslo_vmware.rw_handles [ 1564.894624] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/9bc9d6ef-811f-434b-b75d-35439e66de95/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1564.896211] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1564.896449] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Copying Virtual Disk [datastore1] vmware_temp/9bc9d6ef-811f-434b-b75d-35439e66de95/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/9bc9d6ef-811f-434b-b75d-35439e66de95/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1564.896722] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d2cced52-73cb-48c2-96a6-c152efe750c4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1564.904721] env[67893]: DEBUG oslo_vmware.api [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Waiting for the task: (returnval){ [ 1564.904721] env[67893]: value = "task-3455446" [ 1564.904721] env[67893]: _type 
= "Task" [ 1564.904721] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1564.913495] env[67893]: DEBUG oslo_vmware.api [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Task: {'id': task-3455446, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1565.415835] env[67893]: DEBUG oslo_vmware.exceptions [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1565.416183] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1565.416741] env[67893]: ERROR nova.compute.manager [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1565.416741] env[67893]: Faults: ['InvalidArgument'] [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Traceback (most recent call last): [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] yield resources [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] self.driver.spawn(context, instance, image_meta, [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] self._fetch_image_if_missing(context, vi) [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] image_cache(vi, tmp_image_ds_loc) 
[ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] vm_util.copy_virtual_disk( [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] session._wait_for_task(vmdk_copy_task) [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] return self.wait_for_task(task_ref) [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] return evt.wait() [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] result = hub.switch() [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] return self.greenlet.switch() [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] self.f(*self.args, **self.kw) [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] raise exceptions.translate_fault(task_info.error) [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Faults: ['InvalidArgument'] [ 1565.416741] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] [ 1565.417744] env[67893]: INFO nova.compute.manager [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Terminating instance [ 1565.418592] env[67893]: DEBUG oslo_concurrency.lockutils [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 
tempest-ImagesNegativeTestJSON-1391670457-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1565.418792] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1565.419041] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-39273011-f546-42ce-a530-00dc8ac14ff5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1565.421356] env[67893]: DEBUG nova.compute.manager [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1565.421540] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1565.422251] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5844f7e4-1ff2-4892-b654-e9745bd2227c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1565.428714] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1565.428925] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-07d21038-8f37-4cab-8738-42f264389023 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1565.431015] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1565.431195] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1565.432092] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5a2203e8-151d-42c7-bcd3-a9494cf9082b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1565.436622] env[67893]: DEBUG oslo_vmware.api [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Waiting for the task: (returnval){ [ 1565.436622] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]525c66fb-8c94-f85b-2c6c-ec8b470ac569" [ 1565.436622] env[67893]: _type = "Task" [ 1565.436622] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1565.443488] env[67893]: DEBUG oslo_vmware.api [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]525c66fb-8c94-f85b-2c6c-ec8b470ac569, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1565.496623] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1565.496840] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1565.497028] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Deleting the datastore file [datastore1] 5a24adaf-bced-4488-9ccb-fc996b2ba154 {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1565.497331] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9319917f-1760-461d-9c33-01557af6f603 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1565.503264] env[67893]: DEBUG oslo_vmware.api [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Waiting for the task: (returnval){ [ 1565.503264] env[67893]: value = "task-3455448" [ 1565.503264] env[67893]: _type = "Task" [ 1565.503264] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1565.510856] env[67893]: DEBUG oslo_vmware.api [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Task: {'id': task-3455448, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1565.946386] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1565.946670] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Creating directory with path [datastore1] vmware_temp/390b879b-b664-4a26-a7a6-b07671edb25c/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1565.946872] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e0c5d5d1-4c8c-469b-83f7-b10623bd0912 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1565.957701] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Created directory with path [datastore1] vmware_temp/390b879b-b664-4a26-a7a6-b07671edb25c/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1565.957916] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Fetch image to [datastore1] vmware_temp/390b879b-b664-4a26-a7a6-b07671edb25c/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1565.958081] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/390b879b-b664-4a26-a7a6-b07671edb25c/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1565.958789] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89be8c99-dd1a-4705-8c7a-f99631ab45dc {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1565.965042] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f84184b-25cf-4892-a0b4-e133142b0b4a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1565.973732] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f84398ab-ffe8-42c1-a843-57294592fcc2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1566.003122] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-99b94584-691f-4e0a-8619-a42895004468 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1566.014676] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fb5bb87b-5541-4851-824f-3cfb471ca278 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1566.016335] env[67893]: DEBUG oslo_vmware.api [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Task: {'id': task-3455448, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07447} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1566.016641] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1566.016758] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1566.016925] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1566.017107] env[67893]: INFO nova.compute.manager [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Took 0.60 seconds to destroy the instance on the hypervisor. 
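
The destroy sequence recorded above (UnregisterVM, then DeleteDatastoreFile_Task, then "Deleted contents of the VM") is driven by the same task-polling pattern that produces the repeated `Task: {...} progress is 0%.` and `completed successfully ... duration_secs` lines throughout this log: the driver submits a vCenter task, then polls it on a fixed interval until it reaches a terminal state. The following is a minimal, self-contained sketch of that loop; `FakeTask`, the state names, and `poll_interval` mirror the vSphere task model but are illustrative stand-ins, not the actual oslo.vmware implementation.

```python
import time


# Illustrative stand-in for a vSphere task handle; the real driver gets a
# managed object reference back from calls like DeleteDatastoreFile_Task.
class FakeTask:
    def __init__(self, duration_secs):
        self._deadline = time.monotonic() + duration_secs

    def info(self):
        # vSphere tasks report queued/running/success/error plus a progress %.
        if time.monotonic() >= self._deadline:
            return {"state": "success", "progress": 100}
        return {"state": "running", "progress": 0}


def wait_for_task(task, poll_interval=0.5, log=print):
    """Poll `task` until it succeeds or errors, logging each poll.

    This mirrors the shape of the wait_for_task/_poll_task pair seen in the
    log lines above; it is a sketch, not the oslo_vmware library code.
    """
    started = time.monotonic()
    while True:
        info = task.info()
        if info["state"] == "success":
            log("Task completed successfully. duration_secs=%.5f"
                % (time.monotonic() - started))
            return info
        if info["state"] == "error":
            # The real code translates the task's fault into an exception;
            # see the fault-translation sketch at the end of this excerpt.
            raise RuntimeError("task failed")
        log("Task progress is %d%%." % info["progress"])
        time.sleep(poll_interval)


if __name__ == "__main__":
    wait_for_task(FakeTask(duration_secs=1.2))
```

The fixed-interval poll is why a fast task such as the 0.074s DeleteDatastoreFile_Task still logs one "progress is 0%" line before the completion line: the first poll almost always lands while the task is still running.
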
[ 1566.019209] env[67893]: DEBUG nova.compute.claims [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1566.019386] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1566.019601] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1566.040578] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1566.258701] env[67893]: DEBUG oslo_vmware.rw_handles [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/390b879b-b664-4a26-a7a6-b07671edb25c/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1566.316072] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e199c90-e18f-4a4d-827e-270629f09f6a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1566.321096] env[67893]: DEBUG oslo_vmware.rw_handles [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1566.321455] env[67893]: DEBUG oslo_vmware.rw_handles [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/390b879b-b664-4a26-a7a6-b07671edb25c/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1566.325622] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-789b4a68-abf2-4c47-b373-19c68d39b45e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1566.356507] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5051a54a-3264-4e23-9c13-3b2d11ca75bd {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1566.367039] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fa71f79-9a13-47e6-80a8-2341ee52637e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1566.378552] env[67893]: DEBUG nova.compute.provider_tree [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1566.388011] env[67893]: DEBUG nova.scheduler.client.report [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1566.401263] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.382s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1566.401847] env[67893]: ERROR nova.compute.manager [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1566.401847] env[67893]: Faults: ['InvalidArgument'] [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Traceback (most recent call last): [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] self.driver.spawn(context, instance, image_meta, [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] File 
"/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] self._fetch_image_if_missing(context, vi) [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] image_cache(vi, tmp_image_ds_loc) [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] vm_util.copy_virtual_disk( [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] session._wait_for_task(vmdk_copy_task) [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] return self.wait_for_task(task_ref) [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] return evt.wait() [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] result = hub.switch() [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] return self.greenlet.switch() [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] self.f(*self.args, **self.kw) [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] raise exceptions.translate_fault(task_info.error) [ 
1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Faults: ['InvalidArgument'] [ 1566.401847] env[67893]: ERROR nova.compute.manager [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] [ 1566.402780] env[67893]: DEBUG nova.compute.utils [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1566.403943] env[67893]: DEBUG nova.compute.manager [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Build of instance 5a24adaf-bced-4488-9ccb-fc996b2ba154 was re-scheduled: A specified parameter was not correct: fileType [ 1566.403943] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1566.404342] env[67893]: DEBUG nova.compute.manager [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1566.404528] env[67893]: DEBUG nova.compute.manager [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1566.404705] env[67893]: DEBUG nova.compute.manager [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1566.404866] env[67893]: DEBUG nova.network.neutron [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1566.743413] env[67893]: DEBUG nova.network.neutron [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1566.758043] env[67893]: INFO nova.compute.manager [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Took 0.35 seconds to deallocate network for instance. [ 1566.876749] env[67893]: INFO nova.scheduler.client.report [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Deleted allocations for instance 5a24adaf-bced-4488-9ccb-fc996b2ba154 [ 1566.899411] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f626f805-4f3d-47e8-9bba-0c98b8f0497d tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Lock "5a24adaf-bced-4488-9ccb-fc996b2ba154" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 689.358s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1566.900621] env[67893]: DEBUG oslo_concurrency.lockutils [None req-789edfb5-549f-452d-8221-662354d8320b tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Lock "5a24adaf-bced-4488-9ccb-fc996b2ba154" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 488.461s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1566.900839] env[67893]: DEBUG oslo_concurrency.lockutils [None req-789edfb5-549f-452d-8221-662354d8320b tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Acquiring lock "5a24adaf-bced-4488-9ccb-fc996b2ba154-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1566.901052] env[67893]: DEBUG oslo_concurrency.lockutils [None req-789edfb5-549f-452d-8221-662354d8320b tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Lock "5a24adaf-bced-4488-9ccb-fc996b2ba154-events" 
acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1566.901261] env[67893]: DEBUG oslo_concurrency.lockutils [None req-789edfb5-549f-452d-8221-662354d8320b tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Lock "5a24adaf-bced-4488-9ccb-fc996b2ba154-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1566.904634] env[67893]: INFO nova.compute.manager [None req-789edfb5-549f-452d-8221-662354d8320b tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Terminating instance [ 1566.906320] env[67893]: DEBUG nova.compute.manager [None req-789edfb5-549f-452d-8221-662354d8320b tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1566.906505] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-789edfb5-549f-452d-8221-662354d8320b tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1566.906764] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3fe6348a-4da2-441f-97fb-c2b4797c53d6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1566.911541] env[67893]: DEBUG nova.compute.manager [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1566.918038] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f473e557-3a38-4015-a6d4-37728545ebd5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1566.947400] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-789edfb5-549f-452d-8221-662354d8320b tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 5a24adaf-bced-4488-9ccb-fc996b2ba154 could not be found. 
[ 1566.947729] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-789edfb5-549f-452d-8221-662354d8320b tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1566.947832] env[67893]: INFO nova.compute.manager [None req-789edfb5-549f-452d-8221-662354d8320b tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1566.948134] env[67893]: DEBUG oslo.service.loopingcall [None req-789edfb5-549f-452d-8221-662354d8320b tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1566.948934] env[67893]: DEBUG nova.compute.manager [-] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1566.949048] env[67893]: DEBUG nova.network.neutron [-] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1566.966701] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1566.966943] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1566.968430] env[67893]: INFO nova.compute.claims [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1566.984023] env[67893]: DEBUG nova.network.neutron [-] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1567.002735] env[67893]: INFO nova.compute.manager [-] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] Took 0.05 seconds to deallocate network for instance. 
[ 1567.096021] env[67893]: DEBUG oslo_concurrency.lockutils [None req-789edfb5-549f-452d-8221-662354d8320b tempest-ServerRescueNegativeTestJSON-558410913 tempest-ServerRescueNegativeTestJSON-558410913-project-member] Lock "5a24adaf-bced-4488-9ccb-fc996b2ba154" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.195s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1567.097067] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "5a24adaf-bced-4488-9ccb-fc996b2ba154" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 376.286s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1567.097067] env[67893]: INFO nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5a24adaf-bced-4488-9ccb-fc996b2ba154] During sync_power_state the instance has a pending task (deleting). Skip. [ 1567.097067] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "5a24adaf-bced-4488-9ccb-fc996b2ba154" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1567.191812] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-737d3858-760c-4510-8622-3e9f21019930 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1567.198916] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a6c40a9-d0f5-4737-bf6d-518f4716c767 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1567.228663] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3dfd650e-a887-4716-bce8-d93bea7d7c5c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1567.235883] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65c1a867-e057-43eb-84f8-92ee21b0159e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1567.249900] env[67893]: DEBUG nova.compute.provider_tree [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1567.258724] env[67893]: DEBUG nova.scheduler.client.report [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 
0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1567.272637] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.306s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1567.273193] env[67893]: DEBUG nova.compute.manager [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1567.302800] env[67893]: DEBUG nova.compute.utils [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1567.304027] env[67893]: DEBUG nova.compute.manager [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1567.304132] env[67893]: DEBUG nova.network.neutron [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1567.316566] env[67893]: DEBUG nova.compute.manager [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1567.360912] env[67893]: DEBUG nova.policy [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '894285baafaf410ea301f676b78c45f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b439a6039a714a6fabd3c0477629d3c1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 1567.378143] env[67893]: DEBUG nova.compute.manager [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1567.402822] env[67893]: DEBUG nova.virt.hardware [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1567.403074] env[67893]: DEBUG nova.virt.hardware [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1567.403239] env[67893]: DEBUG nova.virt.hardware [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1567.403420] env[67893]: DEBUG nova.virt.hardware [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1567.403566] env[67893]: DEBUG nova.virt.hardware [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1567.403706] env[67893]: DEBUG nova.virt.hardware [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1567.403947] env[67893]: DEBUG nova.virt.hardware [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1567.404170] env[67893]: DEBUG nova.virt.hardware [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1567.404366] env[67893]: DEBUG nova.virt.hardware [None 
req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1567.404570] env[67893]: DEBUG nova.virt.hardware [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1567.404763] env[67893]: DEBUG nova.virt.hardware [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1567.405683] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73dd3f64-9aba-4e4c-bcf1-30ecb593871f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1567.413338] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88748e5e-1057-4746-85ca-edf7a29db67f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1567.675227] env[67893]: DEBUG nova.network.neutron [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Successfully created port: 6f6765f2-3c7a-4b9d-b669-0f94f73c1422 {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1568.384028] env[67893]: DEBUG nova.network.neutron [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Successfully updated port: 6f6765f2-3c7a-4b9d-b669-0f94f73c1422 {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1568.402686] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "refresh_cache-2875b0a3-0213-4908-b86b-ce45a8901553" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1568.402849] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquired lock "refresh_cache-2875b0a3-0213-4908-b86b-ce45a8901553" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1568.402995] env[67893]: DEBUG nova.network.neutron [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1568.438344] env[67893]: DEBUG nova.network.neutron [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 
tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1568.632355] env[67893]: DEBUG nova.network.neutron [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Updating instance_info_cache with network_info: [{"id": "6f6765f2-3c7a-4b9d-b669-0f94f73c1422", "address": "fa:16:3e:83:59:b9", "network": {"id": "3269c624-7a70-494c-85bc-8230ffbbab83", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-740576182-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b439a6039a714a6fabd3c0477629d3c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6f6765f2-3c", "ovs_interfaceid": "6f6765f2-3c7a-4b9d-b669-0f94f73c1422", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1568.646060] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Releasing lock "refresh_cache-2875b0a3-0213-4908-b86b-ce45a8901553" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1568.646060] env[67893]: DEBUG nova.compute.manager [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Instance network_info: |[{"id": "6f6765f2-3c7a-4b9d-b669-0f94f73c1422", "address": "fa:16:3e:83:59:b9", "network": {"id": "3269c624-7a70-494c-85bc-8230ffbbab83", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-740576182-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b439a6039a714a6fabd3c0477629d3c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6f6765f2-3c", "ovs_interfaceid": "6f6765f2-3c7a-4b9d-b669-0f94f73c1422", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": 
{}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1568.646060] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:83:59:b9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'fa01fe1a-83b6-4c10-af75-00ddb17f9bbf', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6f6765f2-3c7a-4b9d-b669-0f94f73c1422', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1568.652943] env[67893]: DEBUG oslo.service.loopingcall [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1568.653426] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1568.653663] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b915d5a0-05ff-4b91-8fbe-182abe36bb5f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1568.673905] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1568.673905] env[67893]: value = "task-3455449" [ 1568.673905] env[67893]: _type = "Task" [ 1568.673905] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1568.687543] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455449, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1568.804524] env[67893]: DEBUG nova.compute.manager [req-7e4d788d-2741-4102-be9c-9c8fbc32840a req-28615044-3516-4fea-9157-37619423708e service nova] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Received event network-vif-plugged-6f6765f2-3c7a-4b9d-b669-0f94f73c1422 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1568.804795] env[67893]: DEBUG oslo_concurrency.lockutils [req-7e4d788d-2741-4102-be9c-9c8fbc32840a req-28615044-3516-4fea-9157-37619423708e service nova] Acquiring lock "2875b0a3-0213-4908-b86b-ce45a8901553-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1568.804973] env[67893]: DEBUG oslo_concurrency.lockutils [req-7e4d788d-2741-4102-be9c-9c8fbc32840a req-28615044-3516-4fea-9157-37619423708e service nova] Lock "2875b0a3-0213-4908-b86b-ce45a8901553-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1568.805317] env[67893]: DEBUG oslo_concurrency.lockutils [req-7e4d788d-2741-4102-be9c-9c8fbc32840a req-28615044-3516-4fea-9157-37619423708e service nova] Lock "2875b0a3-0213-4908-b86b-ce45a8901553-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1568.805504] env[67893]: DEBUG nova.compute.manager [req-7e4d788d-2741-4102-be9c-9c8fbc32840a req-28615044-3516-4fea-9157-37619423708e service nova] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] No waiting events found dispatching network-vif-plugged-6f6765f2-3c7a-4b9d-b669-0f94f73c1422 {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1568.805670] env[67893]: WARNING nova.compute.manager [req-7e4d788d-2741-4102-be9c-9c8fbc32840a req-28615044-3516-4fea-9157-37619423708e service nova] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Received unexpected event network-vif-plugged-6f6765f2-3c7a-4b9d-b669-0f94f73c1422 for instance with vm_state building and task_state spawning. [ 1568.805828] env[67893]: DEBUG nova.compute.manager [req-7e4d788d-2741-4102-be9c-9c8fbc32840a req-28615044-3516-4fea-9157-37619423708e service nova] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Received event network-changed-6f6765f2-3c7a-4b9d-b669-0f94f73c1422 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1568.805981] env[67893]: DEBUG nova.compute.manager [req-7e4d788d-2741-4102-be9c-9c8fbc32840a req-28615044-3516-4fea-9157-37619423708e service nova] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Refreshing instance network info cache due to event network-changed-6f6765f2-3c7a-4b9d-b669-0f94f73c1422. 
{{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1568.806204] env[67893]: DEBUG oslo_concurrency.lockutils [req-7e4d788d-2741-4102-be9c-9c8fbc32840a req-28615044-3516-4fea-9157-37619423708e service nova] Acquiring lock "refresh_cache-2875b0a3-0213-4908-b86b-ce45a8901553" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1568.806350] env[67893]: DEBUG oslo_concurrency.lockutils [req-7e4d788d-2741-4102-be9c-9c8fbc32840a req-28615044-3516-4fea-9157-37619423708e service nova] Acquired lock "refresh_cache-2875b0a3-0213-4908-b86b-ce45a8901553" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1568.806503] env[67893]: DEBUG nova.network.neutron [req-7e4d788d-2741-4102-be9c-9c8fbc32840a req-28615044-3516-4fea-9157-37619423708e service nova] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Refreshing network info cache for port 6f6765f2-3c7a-4b9d-b669-0f94f73c1422 {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1569.194875] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455449, 'name': CreateVM_Task, 'duration_secs': 0.301887} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1569.195040] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1569.195760] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1569.195933] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1569.196349] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1569.196624] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-dcea2851-d217-44ab-8023-961de648cc2a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1569.201545] env[67893]: DEBUG oslo_vmware.api [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Waiting for the task: (returnval){ [ 1569.201545] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52a1d341-2b0c-129e-f39d-344476e696f9" [ 1569.201545] env[67893]: _type = "Task" [ 1569.201545] env[67893]: } to complete. 
{{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1569.218947] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1569.219223] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1569.219424] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1569.336590] env[67893]: DEBUG nova.network.neutron [req-7e4d788d-2741-4102-be9c-9c8fbc32840a req-28615044-3516-4fea-9157-37619423708e service nova] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Updated VIF entry in instance network info cache for port 6f6765f2-3c7a-4b9d-b669-0f94f73c1422. {{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1569.336590] env[67893]: DEBUG nova.network.neutron [req-7e4d788d-2741-4102-be9c-9c8fbc32840a req-28615044-3516-4fea-9157-37619423708e service nova] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Updating instance_info_cache with network_info: [{"id": "6f6765f2-3c7a-4b9d-b669-0f94f73c1422", "address": "fa:16:3e:83:59:b9", "network": {"id": "3269c624-7a70-494c-85bc-8230ffbbab83", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-740576182-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b439a6039a714a6fabd3c0477629d3c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6f6765f2-3c", "ovs_interfaceid": "6f6765f2-3c7a-4b9d-b669-0f94f73c1422", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1569.345272] env[67893]: DEBUG oslo_concurrency.lockutils [req-7e4d788d-2741-4102-be9c-9c8fbc32840a req-28615044-3516-4fea-9157-37619423708e service nova] Releasing lock "refresh_cache-2875b0a3-0213-4908-b86b-ce45a8901553" {{(pid=67893) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1578.648449] env[67893]: DEBUG oslo_concurrency.lockutils [None req-269d84f0-8244-42b8-8300-227cbcea2b2a tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "e1849daf-3781-42ef-bede-267efbb652c9" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1580.378286] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Acquiring lock "9fc9f6b0-928e-46b4-ad7c-9217b2f31575" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1580.378616] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Lock "9fc9f6b0-928e-46b4-ad7c-9217b2f31575" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1603.301622] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1607.858778] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1607.859101] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1607.859101] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1607.880684] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1607.880796] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1607.880885] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Skipping network cache update for instance because it is Building. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1607.881085] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1607.881266] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1607.881393] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1607.881524] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1607.881652] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1607.881771] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1607.881887] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1607.882010] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1607.882489] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1611.859569] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1611.859875] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1611.859979] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1612.529439] env[67893]: WARNING oslo_vmware.rw_handles [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1612.529439] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1612.529439] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1612.529439] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1612.529439] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1612.529439] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 1612.529439] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1612.529439] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1612.529439] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1612.529439] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1612.529439] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1612.529439] env[67893]: ERROR oslo_vmware.rw_handles [ 1612.530108] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/390b879b-b664-4a26-a7a6-b07671edb25c/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1612.531886] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: 
efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1612.532136] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Copying Virtual Disk [datastore1] vmware_temp/390b879b-b664-4a26-a7a6-b07671edb25c/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/390b879b-b664-4a26-a7a6-b07671edb25c/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1612.532418] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f3903727-f026-4c5a-b829-2792c672e090 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1612.540104] env[67893]: DEBUG oslo_vmware.api [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Waiting for the task: (returnval){ [ 1612.540104] env[67893]: value = "task-3455450" [ 1612.540104] env[67893]: _type = "Task" [ 1612.540104] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1612.547896] env[67893]: DEBUG oslo_vmware.api [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Task: {'id': task-3455450, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1612.858813] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1613.050723] env[67893]: DEBUG oslo_vmware.exceptions [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Fault InvalidArgument not matched. 
{{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1613.051088] env[67893]: DEBUG oslo_concurrency.lockutils [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1613.051572] env[67893]: ERROR nova.compute.manager [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1613.051572] env[67893]: Faults: ['InvalidArgument'] [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Traceback (most recent call last): [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] yield resources [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] self.driver.spawn(context, instance, image_meta, [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] self._fetch_image_if_missing(context, vi) [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] image_cache(vi, tmp_image_ds_loc) [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] vm_util.copy_virtual_disk( [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] session._wait_for_task(vmdk_copy_task) [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] return self.wait_for_task(task_ref) [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] return evt.wait() [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] result = hub.switch() [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] return self.greenlet.switch() [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] self.f(*self.args, **self.kw) [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] raise exceptions.translate_fault(task_info.error) [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Faults: ['InvalidArgument'] [ 1613.051572] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] [ 1613.052881] env[67893]: INFO nova.compute.manager [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Terminating instance [ 1613.053431] env[67893]: DEBUG oslo_concurrency.lockutils [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1613.053685] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1613.053941] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-47d52d7f-511b-43aa-b855-eb0db683febd {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.056346] env[67893]: DEBUG nova.compute.manager [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1613.056539] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1613.057259] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf464b9f-a906-4088-9f4d-83c38c2cf6a5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.063506] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1613.063721] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-086e8bed-e8e8-445b-b9b3-278f855bc667 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.065816] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1613.065994] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1613.067035] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-88f2a135-b787-4a19-ae4d-4ffa094341c8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.071667] env[67893]: DEBUG oslo_vmware.api [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Waiting for the task: (returnval){ [ 1613.071667] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52156f8b-c412-38f1-2530-eac4b34f10b7" [ 1613.071667] env[67893]: _type = "Task" [ 1613.071667] env[67893]: } to complete. 
{{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1613.078281] env[67893]: DEBUG oslo_vmware.api [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52156f8b-c412-38f1-2530-eac4b34f10b7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1613.138110] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1613.138504] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1613.138767] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Deleting the datastore file [datastore1] efdb0a7e-403d-4de5-8c09-72b9c8f9cd79 {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1613.139168] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6b1bb30e-bd6a-41a1-a1bf-2380dc674c17 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.146580] env[67893]: DEBUG oslo_vmware.api [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Waiting for the task: (returnval){ [ 1613.146580] env[67893]: value = "task-3455452" [ 1613.146580] env[67893]: _type = "Task" [ 1613.146580] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1613.157086] env[67893]: DEBUG oslo_vmware.api [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Task: {'id': task-3455452, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1613.581845] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1613.582137] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Creating directory with path [datastore1] vmware_temp/83748bc2-bc5f-40d9-81d8-a7b9e336e812/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1613.582375] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f7bbca52-422c-44f7-b38a-50b1fdcaf97a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.593787] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Created directory with path [datastore1] vmware_temp/83748bc2-bc5f-40d9-81d8-a7b9e336e812/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1613.593980] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Fetch image to [datastore1] vmware_temp/83748bc2-bc5f-40d9-81d8-a7b9e336e812/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1613.594170] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/83748bc2-bc5f-40d9-81d8-a7b9e336e812/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1613.594892] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0485aa2b-4f7b-4823-b74c-f9b779941554 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.601522] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-899389d3-2296-40ba-8469-c16143362982 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.610198] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-337db432-c401-4190-8133-370347455164 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.640853] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-5764601f-1198-41e0-ab17-050fc1aafc3a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.646559] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-79b1e9af-7e84-4d3f-8350-62954e26eb83 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.655323] env[67893]: DEBUG oslo_vmware.api [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Task: {'id': task-3455452, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068258} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1613.655588] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1613.655787] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1613.655957] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1613.656146] env[67893]: INFO nova.compute.manager [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Took 0.60 seconds to destroy the instance on the hypervisor. 
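The failed spawn recorded above follows one oslo.vmware pattern end to end: a CopyVirtualDisk_Task is started, wait_for_task polls it (the "progress is 0%" records), and when the task errors the poller translates the server-side fault; because get_fault_class logged "Fault InvalidArgument not matched", the generic VimFaultException surfaces with the fault name in fault_list. A minimal sketch of that pattern, not Nova's actual vm_util code, with invented names (copy_sparse_disk, dc_ref) and assuming an already established oslo_vmware.api.VMwareAPISession:

    from oslo_vmware import exceptions as vexc

    def copy_sparse_disk(session, source_path, dest_path, dc_ref):
        # CopyVirtualDisk_Task is invoked on the virtualDiskManager managed
        # object; each call is one SOAP request like the opID records above.
        disk_mgr = session.vim.service_content.virtualDiskManager
        task = session.invoke_api(
            session.vim, 'CopyVirtualDisk_Task', disk_mgr,
            sourceName=source_path, sourceDatacenter=dc_ref,
            destName=dest_path, destDatacenter=dc_ref)
        try:
            # Polls the task until it completes or errors, re-raising the
            # translated fault on failure.
            return session.wait_for_task(task)
        except vexc.VimFaultException as e:
            # No dedicated exception class matched 'InvalidArgument' ("Fault
            # InvalidArgument not matched" above), so the generic
            # VimFaultException carries the fault names in e.fault_list.
            print('CopyVirtualDisk_Task failed: %s (faults: %s)'
                  % (e, e.fault_list))
            raise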
[ 1613.658303] env[67893]: DEBUG nova.compute.claims [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1613.658478] env[67893]: DEBUG oslo_concurrency.lockutils [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1613.658703] env[67893]: DEBUG oslo_concurrency.lockutils [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1613.671463] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1613.784942] env[67893]: DEBUG oslo_vmware.rw_handles [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/83748bc2-bc5f-40d9-81d8-a7b9e336e812/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1613.846305] env[67893]: DEBUG oslo_vmware.rw_handles [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1613.846544] env[67893]: DEBUG oslo_vmware.rw_handles [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/83748bc2-bc5f-40d9-81d8-a7b9e336e812/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1613.858267] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1613.858441] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1613.944473] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e74c93eb-076a-4601-b412-8f83103500a1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.952329] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d107f0e-6251-4fe0-9d0c-5d60a07ebdaa {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.981929] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f95c306f-b484-42cf-87fc-d78c4c375cd3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.988530] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24e169d0-23e1-40ab-89f6-b5838d0ee0b9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1614.000822] env[67893]: DEBUG nova.compute.provider_tree [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1614.008681] env[67893]: DEBUG nova.scheduler.client.report [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1614.022321] env[67893]: DEBUG oslo_concurrency.lockutils [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.364s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1614.022869] env[67893]: ERROR nova.compute.manager [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: 
efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1614.022869] env[67893]: Faults: ['InvalidArgument'] [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Traceback (most recent call last): [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] self.driver.spawn(context, instance, image_meta, [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] self._fetch_image_if_missing(context, vi) [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] image_cache(vi, tmp_image_ds_loc) [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] vm_util.copy_virtual_disk( [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] session._wait_for_task(vmdk_copy_task) [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] return self.wait_for_task(task_ref) [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] return evt.wait() [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] result = hub.switch() [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1614.022869] env[67893]: ERROR 
nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] return self.greenlet.switch() [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] self.f(*self.args, **self.kw) [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] raise exceptions.translate_fault(task_info.error) [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Faults: ['InvalidArgument'] [ 1614.022869] env[67893]: ERROR nova.compute.manager [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] [ 1614.023800] env[67893]: DEBUG nova.compute.utils [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1614.024891] env[67893]: DEBUG nova.compute.manager [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Build of instance efdb0a7e-403d-4de5-8c09-72b9c8f9cd79 was re-scheduled: A specified parameter was not correct: fileType [ 1614.024891] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1614.025257] env[67893]: DEBUG nova.compute.manager [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1614.025431] env[67893]: DEBUG nova.compute.manager [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1614.025600] env[67893]: DEBUG nova.compute.manager [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1614.025762] env[67893]: DEBUG nova.network.neutron [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1614.551813] env[67893]: DEBUG nova.network.neutron [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1614.569022] env[67893]: INFO nova.compute.manager [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Took 0.54 seconds to deallocate network for instance. [ 1614.668098] env[67893]: INFO nova.scheduler.client.report [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Deleted allocations for instance efdb0a7e-403d-4de5-8c09-72b9c8f9cd79 [ 1614.690299] env[67893]: DEBUG oslo_concurrency.lockutils [None req-91626aa5-5770-48b7-9e20-d4c61c396568 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Lock "efdb0a7e-403d-4de5-8c09-72b9c8f9cd79" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 675.801s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1614.691527] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d1843f30-62a7-4738-b1fa-eb5b96f0b761 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Lock "efdb0a7e-403d-4de5-8c09-72b9c8f9cd79" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 479.466s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1614.691730] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d1843f30-62a7-4738-b1fa-eb5b96f0b761 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Acquiring lock "efdb0a7e-403d-4de5-8c09-72b9c8f9cd79-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1614.691970] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d1843f30-62a7-4738-b1fa-eb5b96f0b761 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Lock "efdb0a7e-403d-4de5-8c09-72b9c8f9cd79-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: 
waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1614.692122] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d1843f30-62a7-4738-b1fa-eb5b96f0b761 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Lock "efdb0a7e-403d-4de5-8c09-72b9c8f9cd79-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1614.694125] env[67893]: INFO nova.compute.manager [None req-d1843f30-62a7-4738-b1fa-eb5b96f0b761 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Terminating instance [ 1614.695912] env[67893]: DEBUG nova.compute.manager [None req-d1843f30-62a7-4738-b1fa-eb5b96f0b761 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1614.696126] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-d1843f30-62a7-4738-b1fa-eb5b96f0b761 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1614.696612] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a916e5c6-edca-4b29-abfb-e83b27edf1f2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1614.702302] env[67893]: DEBUG nova.compute.manager [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1614.709168] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-885f2aaa-2238-4482-8b89-a4d7e01f8da6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1614.738524] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-d1843f30-62a7-4738-b1fa-eb5b96f0b761 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance efdb0a7e-403d-4de5-8c09-72b9c8f9cd79 could not be found. [ 1614.738746] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-d1843f30-62a7-4738-b1fa-eb5b96f0b761 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1614.738930] env[67893]: INFO nova.compute.manager [None req-d1843f30-62a7-4738-b1fa-eb5b96f0b761 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Took 0.04 seconds to destroy the instance on the hypervisor. 
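The interleaved "Acquiring lock", "acquired by ... waited", and '"released" by ... held' records throughout this section are emitted by oslo.concurrency's lockutils; the plain Acquiring/Acquired/Releasing lines come from the context-manager form, while the variants naming a callable and reporting wait/hold durations come from the decorator's wrapper. A minimal sketch of both call shapes, with hypothetical bodies, assuming default in-process (non-external) locks:

    from oslo_concurrency import lockutils

    # Context-manager form: produces the bare "Acquiring lock", "Acquired
    # lock" and "Releasing lock" records, as around the
    # "refresh_cache-<uuid>" and datastore image-cache locks in this log.
    def refresh_cache(instance_uuid):
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            pass  # hypothetical body: rebuild the instance network cache

    # Decorator form: its wrapper also logs which callable holds the lock
    # and the wait/hold durations, as in the "compute_resources" lock
    # records above.
    @lockutils.synchronized('compute_resources')
    def instance_claim():
        pass  # hypothetical body: claim CPU/RAM/disk on this node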
[ 1614.739197] env[67893]: DEBUG oslo.service.loopingcall [None req-d1843f30-62a7-4738-b1fa-eb5b96f0b761 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1614.741581] env[67893]: DEBUG nova.compute.manager [-] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1614.741696] env[67893]: DEBUG nova.network.neutron [-] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1614.755735] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1614.756017] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1614.757528] env[67893]: INFO nova.compute.claims [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1614.768531] env[67893]: DEBUG nova.network.neutron [-] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1614.776557] env[67893]: INFO nova.compute.manager [-] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] Took 0.03 seconds to deallocate network for instance. [ 1614.881535] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d1843f30-62a7-4738-b1fa-eb5b96f0b761 tempest-ImagesNegativeTestJSON-1391670457 tempest-ImagesNegativeTestJSON-1391670457-project-member] Lock "efdb0a7e-403d-4de5-8c09-72b9c8f9cd79" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.190s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1614.883032] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "efdb0a7e-403d-4de5-8c09-72b9c8f9cd79" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 424.071s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1614.883032] env[67893]: INFO nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: efdb0a7e-403d-4de5-8c09-72b9c8f9cd79] During sync_power_state the instance has a pending task (deleting). Skip. 
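The _sync_power_states activity above, like the earlier "Running periodic task ComputeManager._poll_unconfirmed_resizes" and "_reclaim_queued_deletes" records, comes from oslo.service's periodic-task machinery: the manager mixes in periodic_task.PeriodicTasks and run_periodic_tasks fires each decorated method on its own interval. A minimal sketch, with an assumed spacing value and a stub body:

    from oslo_config import cfg
    from oslo_service import periodic_task

    class Manager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(cfg.CONF)

        # spacing=600 is an assumption; each run produces a "Running
        # periodic task ..._sync_power_states" DEBUG record like the ones
        # in this log.
        @periodic_task.periodic_task(spacing=600)
        def _sync_power_states(self, context):
            # Compare the driver's power state with the DB record;
            # instances with a pending task (e.g. 'deleting', as logged
            # above) are skipped rather than synced.
            pass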
[ 1614.883032] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "efdb0a7e-403d-4de5-8c09-72b9c8f9cd79" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1614.991524] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21b3583c-e305-4d03-b23e-b58711539c31 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1614.999200] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79ab8e72-ae25-4c1d-8301-a0bea59a15e6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1615.028954] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35c05ddf-4863-4e72-91fe-3f0c6f2b93cd {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1615.035910] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-582d3597-71a0-4c03-803a-4e2634ff90a7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1615.048577] env[67893]: DEBUG nova.compute.provider_tree [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1615.058877] env[67893]: DEBUG nova.scheduler.client.report [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1615.071724] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.316s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1615.072202] env[67893]: DEBUG nova.compute.manager [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Start building networks asynchronously for instance. 
{{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1615.103497] env[67893]: DEBUG nova.compute.utils [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1615.104794] env[67893]: DEBUG nova.compute.manager [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1615.104968] env[67893]: DEBUG nova.network.neutron [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1615.116319] env[67893]: DEBUG nova.compute.manager [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1615.169782] env[67893]: DEBUG nova.policy [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7d5bfbf0d1dc4fc49c5abd78a4d5b60d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '88e962e37cb548cc8c544ac0669fcef6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 1615.178642] env[67893]: DEBUG nova.compute.manager [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1615.203522] env[67893]: DEBUG nova.virt.hardware [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1615.203780] env[67893]: DEBUG nova.virt.hardware [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1615.203945] env[67893]: DEBUG nova.virt.hardware [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1615.204142] env[67893]: DEBUG nova.virt.hardware [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1615.204291] env[67893]: DEBUG nova.virt.hardware [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1615.204439] env[67893]: DEBUG nova.virt.hardware [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1615.204647] env[67893]: DEBUG nova.virt.hardware [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1615.204809] env[67893]: DEBUG nova.virt.hardware [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1615.204976] env[67893]: DEBUG 
nova.virt.hardware [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1615.205151] env[67893]: DEBUG nova.virt.hardware [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1615.205409] env[67893]: DEBUG nova.virt.hardware [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1615.206544] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4563eb62-8598-4643-8710-36e5b44ab86f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1615.214350] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2287370d-2134-4467-b132-239d162675a4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1615.518013] env[67893]: DEBUG nova.network.neutron [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Successfully created port: 3a5f82de-f83e-4b4a-943d-c1dd38dbd2bb {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1616.224777] env[67893]: DEBUG nova.network.neutron [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Successfully updated port: 3a5f82de-f83e-4b4a-943d-c1dd38dbd2bb {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1616.237858] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Acquiring lock "refresh_cache-dfb92d1c-c2a5-49c1-8526-3743cb385c97" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1616.238258] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Acquired lock "refresh_cache-dfb92d1c-c2a5-49c1-8526-3743cb385c97" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1616.238258] env[67893]: DEBUG nova.network.neutron [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1616.277129] env[67893]: DEBUG nova.network.neutron [None 
req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1616.511312] env[67893]: DEBUG nova.network.neutron [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Updating instance_info_cache with network_info: [{"id": "3a5f82de-f83e-4b4a-943d-c1dd38dbd2bb", "address": "fa:16:3e:0a:2d:ff", "network": {"id": "a3c83718-ffc5-4aec-b99a-903fc31edaff", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-147544735-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "88e962e37cb548cc8c544ac0669fcef6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eaba65c3-6925-4c7f-83b6-17cd1a328e27", "external-id": "nsx-vlan-transportzone-202", "segmentation_id": 202, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3a5f82de-f8", "ovs_interfaceid": "3a5f82de-f83e-4b4a-943d-c1dd38dbd2bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1616.522555] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Releasing lock "refresh_cache-dfb92d1c-c2a5-49c1-8526-3743cb385c97" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1616.522834] env[67893]: DEBUG nova.compute.manager [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Instance network_info: |[{"id": "3a5f82de-f83e-4b4a-943d-c1dd38dbd2bb", "address": "fa:16:3e:0a:2d:ff", "network": {"id": "a3c83718-ffc5-4aec-b99a-903fc31edaff", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-147544735-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "88e962e37cb548cc8c544ac0669fcef6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eaba65c3-6925-4c7f-83b6-17cd1a328e27", "external-id": "nsx-vlan-transportzone-202", "segmentation_id": 202, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3a5f82de-f8", "ovs_interfaceid": 
"3a5f82de-f83e-4b4a-943d-c1dd38dbd2bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1616.523485] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0a:2d:ff', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'eaba65c3-6925-4c7f-83b6-17cd1a328e27', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3a5f82de-f83e-4b4a-943d-c1dd38dbd2bb', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1616.530531] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Creating folder: Project (88e962e37cb548cc8c544ac0669fcef6). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1616.531039] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-28a22f42-8b2b-4197-80c1-eae510b528a4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1616.542094] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Created folder: Project (88e962e37cb548cc8c544ac0669fcef6) in parent group-v689771. [ 1616.542282] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Creating folder: Instances. Parent ref: group-v689862. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1616.542497] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f8f92ea2-da06-4652-8a9b-7b4eb338c8df {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1616.551495] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Created folder: Instances in parent group-v689862. [ 1616.551495] env[67893]: DEBUG oslo.service.loopingcall [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1616.551669] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1616.551721] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cd3b02d1-7421-40ca-b3be-e423e5863343 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1616.571110] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1616.571110] env[67893]: value = "task-3455455" [ 1616.571110] env[67893]: _type = "Task" [ 1616.571110] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1616.578275] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455455, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1616.590555] env[67893]: DEBUG nova.compute.manager [req-1d152356-13ea-4512-bd53-3c82bce3fad1 req-a915f7e9-fdf2-4328-bea0-0d24a910bb81 service nova] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Received event network-vif-plugged-3a5f82de-f83e-4b4a-943d-c1dd38dbd2bb {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1616.590756] env[67893]: DEBUG oslo_concurrency.lockutils [req-1d152356-13ea-4512-bd53-3c82bce3fad1 req-a915f7e9-fdf2-4328-bea0-0d24a910bb81 service nova] Acquiring lock "dfb92d1c-c2a5-49c1-8526-3743cb385c97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1616.590968] env[67893]: DEBUG oslo_concurrency.lockutils [req-1d152356-13ea-4512-bd53-3c82bce3fad1 req-a915f7e9-fdf2-4328-bea0-0d24a910bb81 service nova] Lock "dfb92d1c-c2a5-49c1-8526-3743cb385c97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1616.591131] env[67893]: DEBUG oslo_concurrency.lockutils [req-1d152356-13ea-4512-bd53-3c82bce3fad1 req-a915f7e9-fdf2-4328-bea0-0d24a910bb81 service nova] Lock "dfb92d1c-c2a5-49c1-8526-3743cb385c97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1616.591298] env[67893]: DEBUG nova.compute.manager [req-1d152356-13ea-4512-bd53-3c82bce3fad1 req-a915f7e9-fdf2-4328-bea0-0d24a910bb81 service nova] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] No waiting events found dispatching network-vif-plugged-3a5f82de-f83e-4b4a-943d-c1dd38dbd2bb {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1616.591449] env[67893]: WARNING nova.compute.manager [req-1d152356-13ea-4512-bd53-3c82bce3fad1 req-a915f7e9-fdf2-4328-bea0-0d24a910bb81 service nova] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Received unexpected event network-vif-plugged-3a5f82de-f83e-4b4a-943d-c1dd38dbd2bb for instance with vm_state building and task_state spawning. 
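The "Acquiring lock … / Lock … acquired / released" triplets above (around pop_instance_event here, and around compute_resources further down) are emitted by oslo.concurrency's lockutils, which nova uses to serialize work on a shared name such as "refresh_cache-<instance uuid>". A minimal sketch of the two usage forms behind those lines, with lock names copied from this log; the function bodies are placeholder stubs, not nova's actual code:

```python
# Sketch of the oslo.concurrency named-lock pattern behind the
# "Acquiring lock ... acquired ... released" entries in this log.
# Assumption: lock names are copied from the log; bodies are stubs.
from oslo_concurrency import lockutils

instance_uuid = "dfb92d1c-c2a5-49c1-8526-3743cb385c97"  # from the entries above

# Context-manager form: held only while the cache is rebuilt. This path
# logs the Acquiring/Acquired/Releasing lines (lockutils.py:312/315/333).
with lockutils.lock("refresh_cache-%s" % instance_uuid):
    pass  # rebuild the instance network-info cache here

# Decorator form: every call is serialized on the same in-process name,
# as with the "compute_resources" lock taken by the resource tracker.
# This path logs the waited/held durations (lockutils.py:404/409/423).
@lockutils.synchronized("compute_resources")
def update_available_resource():
    pass  # audit and report hypervisor resources here
```

The waited/held figures in those entries (":: waited 0.000s", ":: held 0.373s") make lock contention measurable directly from the service log.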
[ 1616.591641] env[67893]: DEBUG nova.compute.manager [req-1d152356-13ea-4512-bd53-3c82bce3fad1 req-a915f7e9-fdf2-4328-bea0-0d24a910bb81 service nova] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Received event network-changed-3a5f82de-f83e-4b4a-943d-c1dd38dbd2bb {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1616.591822] env[67893]: DEBUG nova.compute.manager [req-1d152356-13ea-4512-bd53-3c82bce3fad1 req-a915f7e9-fdf2-4328-bea0-0d24a910bb81 service nova] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Refreshing instance network info cache due to event network-changed-3a5f82de-f83e-4b4a-943d-c1dd38dbd2bb. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1616.592016] env[67893]: DEBUG oslo_concurrency.lockutils [req-1d152356-13ea-4512-bd53-3c82bce3fad1 req-a915f7e9-fdf2-4328-bea0-0d24a910bb81 service nova] Acquiring lock "refresh_cache-dfb92d1c-c2a5-49c1-8526-3743cb385c97" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1616.592156] env[67893]: DEBUG oslo_concurrency.lockutils [req-1d152356-13ea-4512-bd53-3c82bce3fad1 req-a915f7e9-fdf2-4328-bea0-0d24a910bb81 service nova] Acquired lock "refresh_cache-dfb92d1c-c2a5-49c1-8526-3743cb385c97" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1616.592307] env[67893]: DEBUG nova.network.neutron [req-1d152356-13ea-4512-bd53-3c82bce3fad1 req-a915f7e9-fdf2-4328-bea0-0d24a910bb81 service nova] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Refreshing network info cache for port 3a5f82de-f83e-4b4a-943d-c1dd38dbd2bb {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1616.858839] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1616.871366] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1616.871578] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1616.871741] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1616.871893] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1616.872972] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-4a595de5-22f0-4fc4-a4cf-74904c1bcaf7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1616.880463] env[67893]: DEBUG nova.network.neutron [req-1d152356-13ea-4512-bd53-3c82bce3fad1 req-a915f7e9-fdf2-4328-bea0-0d24a910bb81 service nova] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Updated VIF entry in instance network info cache for port 3a5f82de-f83e-4b4a-943d-c1dd38dbd2bb. {{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1616.880784] env[67893]: DEBUG nova.network.neutron [req-1d152356-13ea-4512-bd53-3c82bce3fad1 req-a915f7e9-fdf2-4328-bea0-0d24a910bb81 service nova] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Updating instance_info_cache with network_info: [{"id": "3a5f82de-f83e-4b4a-943d-c1dd38dbd2bb", "address": "fa:16:3e:0a:2d:ff", "network": {"id": "a3c83718-ffc5-4aec-b99a-903fc31edaff", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-147544735-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "88e962e37cb548cc8c544ac0669fcef6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eaba65c3-6925-4c7f-83b6-17cd1a328e27", "external-id": "nsx-vlan-transportzone-202", "segmentation_id": 202, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3a5f82de-f8", "ovs_interfaceid": "3a5f82de-f83e-4b4a-943d-c1dd38dbd2bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1616.882695] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dda08baf-f862-4a29-813b-219725482b41 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1616.897040] env[67893]: DEBUG oslo_concurrency.lockutils [req-1d152356-13ea-4512-bd53-3c82bce3fad1 req-a915f7e9-fdf2-4328-bea0-0d24a910bb81 service nova] Releasing lock "refresh_cache-dfb92d1c-c2a5-49c1-8526-3743cb385c97" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1616.897808] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2a83419-b796-4ea9-a924-4d2a13bacc5c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1616.904283] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5eeba243-249e-44bd-9688-fb14bb26fc4d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1616.932927] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180963MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view 
/opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1616.933067] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1616.933267] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1617.001203] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 5ede1991-efee-4c34-af5b-ce71f67456ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1617.001372] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance b3d31ca3-9a7a-49d0-955f-1e12808bf11f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1617.001500] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 8dbbc2e6-9993-4bf0-b66b-6e685789221c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1617.001623] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1617.001757] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 021f1a86-6015-4a22-b501-3ec9079edbec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1617.001876] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 25d67f98-c132-434b-9d22-4569585527eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1617.001992] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 41b5c5ec-936a-4abe-9db7-38d0d2aa371d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1617.002122] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance e1849daf-3781-42ef-bede-267efbb652c9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1617.002236] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2875b0a3-0213-4908-b86b-ce45a8901553 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1617.002346] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance dfb92d1c-c2a5-49c1-8526-3743cb385c97 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1617.012875] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance ad60df35-54c0-459e-8a25-981922ae0a88 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1617.022921] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance b2c40a66-699c-4185-8ffa-85dbfc4463c5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1617.033407] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 1c6116c0-84c2-40bd-84d2-bf1f4a5b9a10 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1617.043724] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 9fc9f6b0-928e-46b4-ad7c-9217b2f31575 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1617.043944] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1617.044101] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1617.083597] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455455, 'name': CreateVM_Task, 'duration_secs': 0.285798} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1617.084442] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1617.084637] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1617.084805] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1617.085136] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1617.085393] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-997775ea-df92-4bc1-b616-7c3f36556bea {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1617.092434] env[67893]: DEBUG oslo_vmware.api [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Waiting for the task: (returnval){ [ 1617.092434] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52bcc70f-92b7-92cb-7e9a-92ee4ea5d3fc" [ 1617.092434] env[67893]: _type = "Task" [ 1617.092434] env[67893]: } to complete. 
{{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1617.100015] env[67893]: DEBUG oslo_vmware.api [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52bcc70f-92b7-92cb-7e9a-92ee4ea5d3fc, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1617.226627] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6091c511-1ad9-4b44-98d1-835bab617358 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1617.234368] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fdde2fea-2036-419e-9d86-166cd3d624ad {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1617.263670] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d86dd958-e5bf-49da-bfb1-39874eb19e39 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1617.270825] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38bb9b79-f9b4-449e-a62a-378d4e2bdfd2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1617.284180] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1617.292992] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1617.306333] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1617.306551] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.373s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1617.602194] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" 
{{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1617.602449] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1617.602660] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1633.631405] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fd9152e3-d398-4d70-b0a6-7a3f372f9552 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "2875b0a3-0213-4908-b86b-ce45a8901553" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1646.882053] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Acquiring lock "15893b5f-a02a-4ce7-80c9-eea0658f9ac7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1646.882403] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Lock "15893b5f-a02a-4ce7-80c9-eea0658f9ac7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1653.983252] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d7d99f6e-fbaa-4229-aea0-700e12581091 tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Acquiring lock "dfb92d1c-c2a5-49c1-8526-3743cb385c97" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1662.547167] env[67893]: WARNING oslo_vmware.rw_handles [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1662.547167] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1662.547167] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1662.547167] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1662.547167] env[67893]: ERROR oslo_vmware.rw_handles File 
"/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1662.547167] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 1662.547167] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1662.547167] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1662.547167] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1662.547167] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1662.547167] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1662.547167] env[67893]: ERROR oslo_vmware.rw_handles [ 1662.547768] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/83748bc2-bc5f-40d9-81d8-a7b9e336e812/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1662.549997] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1662.553967] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Copying Virtual Disk [datastore1] vmware_temp/83748bc2-bc5f-40d9-81d8-a7b9e336e812/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/83748bc2-bc5f-40d9-81d8-a7b9e336e812/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1662.554322] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-2743976f-d516-454c-a136-0db03a2d587b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1662.562603] env[67893]: DEBUG oslo_vmware.api [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Waiting for the task: (returnval){ [ 1662.562603] env[67893]: value = "task-3455456" [ 1662.562603] env[67893]: _type = "Task" [ 1662.562603] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1662.570789] env[67893]: DEBUG oslo_vmware.api [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Task: {'id': task-3455456, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1663.072651] env[67893]: DEBUG oslo_vmware.exceptions [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1663.072984] env[67893]: DEBUG oslo_concurrency.lockutils [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1663.073525] env[67893]: ERROR nova.compute.manager [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1663.073525] env[67893]: Faults: ['InvalidArgument'] [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Traceback (most recent call last): [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] yield resources [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] self.driver.spawn(context, instance, image_meta, [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] self._fetch_image_if_missing(context, vi) [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] image_cache(vi, tmp_image_ds_loc) [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] vm_util.copy_virtual_disk( [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] session._wait_for_task(vmdk_copy_task) [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] return self.wait_for_task(task_ref) [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] return evt.wait() [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] result = hub.switch() [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] return self.greenlet.switch() [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] self.f(*self.args, **self.kw) [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] raise exceptions.translate_fault(task_info.error) [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Faults: ['InvalidArgument'] [ 1663.073525] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] [ 1663.074386] env[67893]: INFO nova.compute.manager [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Terminating instance [ 1663.075823] env[67893]: DEBUG oslo_concurrency.lockutils [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1663.075823] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 
tempest-DeleteServersTestJSON-726721183-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1663.075823] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a03169db-97f8-4ff0-86dd-0c8360060240 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1663.079327] env[67893]: DEBUG nova.compute.manager [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1663.079517] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1663.080249] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41326847-a06b-43e8-a745-413f2591284e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1663.086621] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1663.086854] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-8f8d504c-c370-4a48-8b32-ed5b54aa6227 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1663.088948] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1663.089145] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1663.090052] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-208dc4cb-2cdb-400b-a39e-031911f4442b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1663.094671] env[67893]: DEBUG oslo_vmware.api [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Waiting for the task: (returnval){ [ 1663.094671] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]5220213a-b778-644b-a917-36cf76fc0eef" [ 1663.094671] env[67893]: _type = "Task" [ 1663.094671] env[67893]: } to complete. 
{{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1663.101378] env[67893]: DEBUG oslo_vmware.api [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]5220213a-b778-644b-a917-36cf76fc0eef, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1663.153908] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1663.154129] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1663.154315] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Deleting the datastore file [datastore1] 5ede1991-efee-4c34-af5b-ce71f67456ef {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1663.154578] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d883f050-36a7-4e5b-b631-153e6cd466c6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1663.161044] env[67893]: DEBUG oslo_vmware.api [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Waiting for the task: (returnval){ [ 1663.161044] env[67893]: value = "task-3455458" [ 1663.161044] env[67893]: _type = "Task" [ 1663.161044] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1663.168426] env[67893]: DEBUG oslo_vmware.api [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Task: {'id': task-3455458, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1663.606171] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1663.606171] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Creating directory with path [datastore1] vmware_temp/100d2234-c4fa-4e40-a135-15156d109414/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1663.606171] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d460bb6c-60d8-4f10-8658-24344df7baf6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1663.617605] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Created directory with path [datastore1] vmware_temp/100d2234-c4fa-4e40-a135-15156d109414/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1663.617814] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Fetch image to [datastore1] vmware_temp/100d2234-c4fa-4e40-a135-15156d109414/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1663.618267] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/100d2234-c4fa-4e40-a135-15156d109414/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1663.618733] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e813da98-24ee-460b-809e-d2266c70529d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1663.625500] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e848adeb-e81d-463b-9db5-490ed85e7412 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1663.634412] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb21223d-e153-4a23-ae1a-b41a4ceb4fee {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1663.666536] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7fdae5b-c9b2-4710-85c0-5e1a3c5f4d8a {{(pid=67893) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1663.673314] env[67893]: DEBUG oslo_vmware.api [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Task: {'id': task-3455458, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07527} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1663.674736] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1663.674931] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1663.675124] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1663.675302] env[67893]: INFO nova.compute.manager [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Took 0.60 seconds to destroy the instance on the hypervisor. 
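The InvalidArgument failure above surfaces through oslo.vmware's task poller: wait_for_task() (api.py:397) parks the caller while _poll_task() (api.py:434/444/448) checks the vCenter task's state and, on error, raises exceptions.translate_fault(task_info.error), which produced the VimFaultException in the spawn traceback. A minimal sketch of that control flow, assuming a hypothetical poll_task_info callable in place of the real PropertyCollector round-trip:

```python
# Sketch of the polling loop whose frames appear in the traceback above
# (oslo_vmware/api.py). Assumption: poll_task_info is a hypothetical
# stand-in for the PropertyCollector call that fetches the task's 'info'
# property; only the control flow mirrors the library.
import time

from oslo_vmware import exceptions as vmw_exc


def wait_for_task_sketch(poll_task_info, poll_interval=0.5):
    """Poll a vCenter task until it ends; return its info or raise.

    poll_task_info() must return an object with a .state attribute
    ('queued', 'running', 'success' or 'error') and an .error fault.
    """
    while True:
        info = poll_task_info()
        if info.state == 'success':
            return info            # e.g. CreateVM_Task "completed successfully"
        if info.state == 'error':
            # translate_fault() maps the VIM fault to a concrete exception
            # class; unmatched faults ("Fault InvalidArgument not matched"
            # above) fall back to the generic VimFaultException.
            raise vmw_exc.translate_fault(info.error)
        time.sleep(poll_interval)  # task still queued/running; poll again
```

In the real driver this loop runs inside an oslo.service looping call on an eventlet greenthread, which is why the traceback passes through loopingcall.py and hub.switch() before reaching _poll_task.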
[ 1663.677050] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f8779105-fb6e-47e1-bc80-db4d46247f5d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1663.678888] env[67893]: DEBUG nova.compute.claims [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1663.679106] env[67893]: DEBUG oslo_concurrency.lockutils [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1663.679332] env[67893]: DEBUG oslo_concurrency.lockutils [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1663.702914] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1663.752874] env[67893]: DEBUG oslo_vmware.rw_handles [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/100d2234-c4fa-4e40-a135-15156d109414/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1663.815631] env[67893]: DEBUG oslo_vmware.rw_handles [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1663.815818] env[67893]: DEBUG oslo_vmware.rw_handles [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/100d2234-c4fa-4e40-a135-15156d109414/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1663.942999] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-338b37ce-9fd4-4dfc-87a0-ac182d01ef64 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1663.950514] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5009dde4-2e4f-48e7-96ea-9bc10dede7e3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1663.980335] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7db9ea8c-289b-4e7c-ac09-0af38f98680d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1663.986723] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-660b8ceb-6815-49ae-a898-81a6a264c333 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1664.000513] env[67893]: DEBUG nova.compute.provider_tree [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1664.008662] env[67893]: DEBUG nova.scheduler.client.report [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1664.025384] env[67893]: DEBUG oslo_concurrency.lockutils [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.346s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1664.025924] env[67893]: ERROR nova.compute.manager [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1664.025924] env[67893]: Faults: ['InvalidArgument'] [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Traceback (most recent call last): [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1664.025924] 
env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] self.driver.spawn(context, instance, image_meta, [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] self._fetch_image_if_missing(context, vi) [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] image_cache(vi, tmp_image_ds_loc) [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] vm_util.copy_virtual_disk( [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] session._wait_for_task(vmdk_copy_task) [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] return self.wait_for_task(task_ref) [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] return evt.wait() [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] result = hub.switch() [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] return self.greenlet.switch() [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] self.f(*self.args, **self.kw) [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] raise exceptions.translate_fault(task_info.error) [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Faults: ['InvalidArgument'] [ 1664.025924] env[67893]: ERROR nova.compute.manager [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] [ 1664.026990] env[67893]: DEBUG nova.compute.utils [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1664.029205] env[67893]: DEBUG nova.compute.manager [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Build of instance 5ede1991-efee-4c34-af5b-ce71f67456ef was re-scheduled: A specified parameter was not correct: fileType [ 1664.029205] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1664.029205] env[67893]: DEBUG nova.compute.manager [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1664.029205] env[67893]: DEBUG nova.compute.manager [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1664.029205] env[67893]: DEBUG nova.compute.manager [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1664.029205] env[67893]: DEBUG nova.network.neutron [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1664.281449] env[67893]: DEBUG nova.network.neutron [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1664.294663] env[67893]: INFO nova.compute.manager [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Took 0.27 seconds to deallocate network for instance. [ 1664.396038] env[67893]: INFO nova.scheduler.client.report [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Deleted allocations for instance 5ede1991-efee-4c34-af5b-ce71f67456ef [ 1664.421171] env[67893]: DEBUG oslo_concurrency.lockutils [None req-aa521616-ed3e-4542-ac25-eb11a34f853d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "5ede1991-efee-4c34-af5b-ce71f67456ef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 675.861s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1664.423030] env[67893]: DEBUG oslo_concurrency.lockutils [None req-e8f51f6a-95ae-4c4e-89dd-e71dcdb2c5a7 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "5ede1991-efee-4c34-af5b-ce71f67456ef" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 479.973s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1664.423114] env[67893]: DEBUG oslo_concurrency.lockutils [None req-e8f51f6a-95ae-4c4e-89dd-e71dcdb2c5a7 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "5ede1991-efee-4c34-af5b-ce71f67456ef-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1664.423300] env[67893]: DEBUG oslo_concurrency.lockutils [None req-e8f51f6a-95ae-4c4e-89dd-e71dcdb2c5a7 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "5ede1991-efee-4c34-af5b-ce71f67456ef-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1664.423506] env[67893]: DEBUG oslo_concurrency.lockutils [None req-e8f51f6a-95ae-4c4e-89dd-e71dcdb2c5a7 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "5ede1991-efee-4c34-af5b-ce71f67456ef-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1664.425558] env[67893]: INFO nova.compute.manager [None req-e8f51f6a-95ae-4c4e-89dd-e71dcdb2c5a7 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Terminating instance [ 1664.427388] env[67893]: DEBUG nova.compute.manager [None req-e8f51f6a-95ae-4c4e-89dd-e71dcdb2c5a7 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1664.427608] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-e8f51f6a-95ae-4c4e-89dd-e71dcdb2c5a7 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1664.428093] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b401b810-9c68-49f4-a06b-7fab256ff320 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1664.438177] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1c9e0ed-cc04-4557-9726-58271020cfc5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1664.449092] env[67893]: DEBUG nova.compute.manager [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1664.471137] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-e8f51f6a-95ae-4c4e-89dd-e71dcdb2c5a7 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 5ede1991-efee-4c34-af5b-ce71f67456ef could not be found. 
[ 1664.471359] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-e8f51f6a-95ae-4c4e-89dd-e71dcdb2c5a7 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1664.471540] env[67893]: INFO nova.compute.manager [None req-e8f51f6a-95ae-4c4e-89dd-e71dcdb2c5a7 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1664.471781] env[67893]: DEBUG oslo.service.loopingcall [None req-e8f51f6a-95ae-4c4e-89dd-e71dcdb2c5a7 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1664.472018] env[67893]: DEBUG nova.compute.manager [-] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1664.472122] env[67893]: DEBUG nova.network.neutron [-] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1664.495026] env[67893]: DEBUG nova.network.neutron [-] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1664.499741] env[67893]: DEBUG oslo_concurrency.lockutils [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1664.499975] env[67893]: DEBUG oslo_concurrency.lockutils [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1664.501704] env[67893]: INFO nova.compute.claims [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1664.504414] env[67893]: INFO nova.compute.manager [-] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] Took 0.03 seconds to deallocate network for instance. 
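Both 'Inventory has not changed' reports in this section carry the same resource-provider inventory for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57. The schedulable capacity implied by those fields is (total - reserved) * allocation_ratio per resource class; a small worked example using the exact values from the log (the helper itself is illustrative, not Nova or placement code):

    # Values copied from the logged inventory data.
    INVENTORY = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    def capacity(inventory):
        # Effective capacity per resource class: (total - reserved) * ratio.
        return {rc: (f['total'] - f['reserved']) * f['allocation_ratio']
                for rc, f in inventory.items()}

    print(capacity(INVENTORY))
    # -> {'VCPU': 192.0, 'MEMORY_MB': 196078.0, 'DISK_GB': 400.0}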
[ 1664.605974] env[67893]: DEBUG oslo_concurrency.lockutils [None req-e8f51f6a-95ae-4c4e-89dd-e71dcdb2c5a7 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "5ede1991-efee-4c34-af5b-ce71f67456ef" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.183s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1664.607496] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "5ede1991-efee-4c34-af5b-ce71f67456ef" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 473.796s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1664.607758] env[67893]: INFO nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 5ede1991-efee-4c34-af5b-ce71f67456ef] During sync_power_state the instance has a pending task (deleting). Skip. [ 1664.608469] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "5ede1991-efee-4c34-af5b-ce71f67456ef" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1664.707035] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21cbad3f-0b36-4c8d-9471-ed4ca0bea159 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1664.714782] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a98ec18e-5059-479f-bb94-70b4a6292645 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1664.745126] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c490176f-ca29-4336-84df-72ed2fd06cc1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1664.752252] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24c9f47e-53bb-4cea-b628-f70bacbe7e8b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1664.764973] env[67893]: DEBUG nova.compute.provider_tree [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1664.772984] env[67893]: DEBUG nova.scheduler.client.report [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 
95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1664.785821] env[67893]: DEBUG oslo_concurrency.lockutils [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.286s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1664.786276] env[67893]: DEBUG nova.compute.manager [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1664.819643] env[67893]: DEBUG nova.compute.utils [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1664.821217] env[67893]: DEBUG nova.compute.manager [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1664.821407] env[67893]: DEBUG nova.network.neutron [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1664.831123] env[67893]: DEBUG nova.compute.manager [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1664.889615] env[67893]: DEBUG nova.policy [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dd097cb06c5a4348a0a98a7d2705d877', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5fc182b40fde498abb43dacf19eed124', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 1664.909781] env[67893]: DEBUG nova.compute.manager [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1664.935388] env[67893]: DEBUG nova.virt.hardware [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1664.935700] env[67893]: DEBUG nova.virt.hardware [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1664.935822] env[67893]: DEBUG nova.virt.hardware [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1664.936017] env[67893]: DEBUG nova.virt.hardware [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1664.936182] env[67893]: DEBUG nova.virt.hardware [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1664.936331] env[67893]: DEBUG nova.virt.hardware [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1664.936538] env[67893]: DEBUG nova.virt.hardware [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1664.936698] env[67893]: DEBUG nova.virt.hardware [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1664.936864] env[67893]: DEBUG nova.virt.hardware [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 
tempest-ImagesTestJSON-2118872471-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1664.937062] env[67893]: DEBUG nova.virt.hardware [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1664.937244] env[67893]: DEBUG nova.virt.hardware [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1664.938108] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a78ed3e7-261a-45a4-848f-c544c345e087 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1664.946579] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fbce6fa-d1c8-4469-8417-3b1ce3ae172d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1665.238791] env[67893]: DEBUG nova.network.neutron [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Successfully created port: 7dfff7a0-c3a1-4cab-80ad-68da451a049c {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1665.307942] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1665.848868] env[67893]: DEBUG nova.network.neutron [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Successfully updated port: 7dfff7a0-c3a1-4cab-80ad-68da451a049c {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1665.860338] env[67893]: DEBUG oslo_concurrency.lockutils [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Acquiring lock "refresh_cache-ad60df35-54c0-459e-8a25-981922ae0a88" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1665.860470] env[67893]: DEBUG oslo_concurrency.lockutils [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Acquired lock "refresh_cache-ad60df35-54c0-459e-8a25-981922ae0a88" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1665.860628] env[67893]: DEBUG nova.network.neutron [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1665.898640] env[67893]: 
DEBUG nova.network.neutron [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1666.099312] env[67893]: DEBUG nova.network.neutron [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Updating instance_info_cache with network_info: [{"id": "7dfff7a0-c3a1-4cab-80ad-68da451a049c", "address": "fa:16:3e:61:d8:cb", "network": {"id": "a023126f-97b0-4ba2-b287-ea8176acba67", "bridge": "br-int", "label": "tempest-ImagesTestJSON-371591198-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5fc182b40fde498abb43dacf19eed124", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c58d99d-ec12-4fc3-ab39-042b3f8cbb89", "external-id": "nsx-vlan-transportzone-44", "segmentation_id": 44, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7dfff7a0-c3", "ovs_interfaceid": "7dfff7a0-c3a1-4cab-80ad-68da451a049c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1666.113417] env[67893]: DEBUG oslo_concurrency.lockutils [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Releasing lock "refresh_cache-ad60df35-54c0-459e-8a25-981922ae0a88" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1666.113704] env[67893]: DEBUG nova.compute.manager [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Instance network_info: |[{"id": "7dfff7a0-c3a1-4cab-80ad-68da451a049c", "address": "fa:16:3e:61:d8:cb", "network": {"id": "a023126f-97b0-4ba2-b287-ea8176acba67", "bridge": "br-int", "label": "tempest-ImagesTestJSON-371591198-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5fc182b40fde498abb43dacf19eed124", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c58d99d-ec12-4fc3-ab39-042b3f8cbb89", "external-id": "nsx-vlan-transportzone-44", "segmentation_id": 44, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7dfff7a0-c3", "ovs_interfaceid": "7dfff7a0-c3a1-4cab-80ad-68da451a049c", "qbh_params": null, "qbg_params": null, 
"active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1666.114111] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:61:d8:cb', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8c58d99d-ec12-4fc3-ab39-042b3f8cbb89', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7dfff7a0-c3a1-4cab-80ad-68da451a049c', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1666.121856] env[67893]: DEBUG oslo.service.loopingcall [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1666.122424] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1666.122656] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-979394d0-6612-4281-8605-8c56c7ae5cf0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1666.143721] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1666.143721] env[67893]: value = "task-3455459" [ 1666.143721] env[67893]: _type = "Task" [ 1666.143721] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1666.152457] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455459, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1666.351551] env[67893]: DEBUG nova.compute.manager [req-58dd86fe-28a3-4fe6-ba29-22d06adf54bb req-034c95b7-27c6-4111-94f8-13224a0f9a5b service nova] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Received event network-vif-plugged-7dfff7a0-c3a1-4cab-80ad-68da451a049c {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1666.351816] env[67893]: DEBUG oslo_concurrency.lockutils [req-58dd86fe-28a3-4fe6-ba29-22d06adf54bb req-034c95b7-27c6-4111-94f8-13224a0f9a5b service nova] Acquiring lock "ad60df35-54c0-459e-8a25-981922ae0a88-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1666.352089] env[67893]: DEBUG oslo_concurrency.lockutils [req-58dd86fe-28a3-4fe6-ba29-22d06adf54bb req-034c95b7-27c6-4111-94f8-13224a0f9a5b service nova] Lock "ad60df35-54c0-459e-8a25-981922ae0a88-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1666.352293] env[67893]: DEBUG oslo_concurrency.lockutils [req-58dd86fe-28a3-4fe6-ba29-22d06adf54bb req-034c95b7-27c6-4111-94f8-13224a0f9a5b service nova] Lock "ad60df35-54c0-459e-8a25-981922ae0a88-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1666.352452] env[67893]: DEBUG nova.compute.manager [req-58dd86fe-28a3-4fe6-ba29-22d06adf54bb req-034c95b7-27c6-4111-94f8-13224a0f9a5b service nova] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] No waiting events found dispatching network-vif-plugged-7dfff7a0-c3a1-4cab-80ad-68da451a049c {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1666.352619] env[67893]: WARNING nova.compute.manager [req-58dd86fe-28a3-4fe6-ba29-22d06adf54bb req-034c95b7-27c6-4111-94f8-13224a0f9a5b service nova] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Received unexpected event network-vif-plugged-7dfff7a0-c3a1-4cab-80ad-68da451a049c for instance with vm_state building and task_state spawning. [ 1666.352844] env[67893]: DEBUG nova.compute.manager [req-58dd86fe-28a3-4fe6-ba29-22d06adf54bb req-034c95b7-27c6-4111-94f8-13224a0f9a5b service nova] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Received event network-changed-7dfff7a0-c3a1-4cab-80ad-68da451a049c {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1666.352987] env[67893]: DEBUG nova.compute.manager [req-58dd86fe-28a3-4fe6-ba29-22d06adf54bb req-034c95b7-27c6-4111-94f8-13224a0f9a5b service nova] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Refreshing instance network info cache due to event network-changed-7dfff7a0-c3a1-4cab-80ad-68da451a049c. 
{{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1666.353185] env[67893]: DEBUG oslo_concurrency.lockutils [req-58dd86fe-28a3-4fe6-ba29-22d06adf54bb req-034c95b7-27c6-4111-94f8-13224a0f9a5b service nova] Acquiring lock "refresh_cache-ad60df35-54c0-459e-8a25-981922ae0a88" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1666.353358] env[67893]: DEBUG oslo_concurrency.lockutils [req-58dd86fe-28a3-4fe6-ba29-22d06adf54bb req-034c95b7-27c6-4111-94f8-13224a0f9a5b service nova] Acquired lock "refresh_cache-ad60df35-54c0-459e-8a25-981922ae0a88" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1666.353535] env[67893]: DEBUG nova.network.neutron [req-58dd86fe-28a3-4fe6-ba29-22d06adf54bb req-034c95b7-27c6-4111-94f8-13224a0f9a5b service nova] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Refreshing network info cache for port 7dfff7a0-c3a1-4cab-80ad-68da451a049c {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1666.654180] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455459, 'name': CreateVM_Task, 'duration_secs': 0.359001} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1666.654378] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1666.655134] env[67893]: DEBUG oslo_concurrency.lockutils [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1666.655309] env[67893]: DEBUG oslo_concurrency.lockutils [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1666.655641] env[67893]: DEBUG oslo_concurrency.lockutils [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1666.655898] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e4a39cfc-f1be-41be-aff6-ab6d1735f7ce {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1666.660433] env[67893]: DEBUG oslo_vmware.api [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Waiting for the task: (returnval){ [ 1666.660433] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]526278b3-e1e8-2785-c4bf-e0e6db98f6d7" [ 1666.660433] env[67893]: _type = "Task" [ 1666.660433] env[67893]: } to complete. 
{{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1666.669439] env[67893]: DEBUG oslo_vmware.api [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]526278b3-e1e8-2785-c4bf-e0e6db98f6d7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1666.841335] env[67893]: DEBUG nova.network.neutron [req-58dd86fe-28a3-4fe6-ba29-22d06adf54bb req-034c95b7-27c6-4111-94f8-13224a0f9a5b service nova] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Updated VIF entry in instance network info cache for port 7dfff7a0-c3a1-4cab-80ad-68da451a049c. {{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1666.841703] env[67893]: DEBUG nova.network.neutron [req-58dd86fe-28a3-4fe6-ba29-22d06adf54bb req-034c95b7-27c6-4111-94f8-13224a0f9a5b service nova] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Updating instance_info_cache with network_info: [{"id": "7dfff7a0-c3a1-4cab-80ad-68da451a049c", "address": "fa:16:3e:61:d8:cb", "network": {"id": "a023126f-97b0-4ba2-b287-ea8176acba67", "bridge": "br-int", "label": "tempest-ImagesTestJSON-371591198-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5fc182b40fde498abb43dacf19eed124", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c58d99d-ec12-4fc3-ab39-042b3f8cbb89", "external-id": "nsx-vlan-transportzone-44", "segmentation_id": 44, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7dfff7a0-c3", "ovs_interfaceid": "7dfff7a0-c3a1-4cab-80ad-68da451a049c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1666.851071] env[67893]: DEBUG oslo_concurrency.lockutils [req-58dd86fe-28a3-4fe6-ba29-22d06adf54bb req-034c95b7-27c6-4111-94f8-13224a0f9a5b service nova] Releasing lock "refresh_cache-ad60df35-54c0-459e-8a25-981922ae0a88" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1667.173655] env[67893]: DEBUG oslo_concurrency.lockutils [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1667.173903] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1667.174131] env[67893]: 
DEBUG oslo_concurrency.lockutils [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1667.860064] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1667.860064] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1667.860064] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1667.881609] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1667.881774] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1667.881907] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1667.882046] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1667.882177] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1667.882324] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1667.882459] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Skipping network cache update for instance because it is Building. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1667.882583] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1667.882701] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1667.882817] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1667.882939] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1668.475795] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dc4e829b-0b43-47eb-b00f-32763c950794 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Acquiring lock "ad60df35-54c0-459e-8a25-981922ae0a88" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1669.858727] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1672.858585] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1673.853675] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1673.853868] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1673.875621] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1673.875968] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1673.875968] env[67893]: DEBUG 
oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1673.876124] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1677.859537] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1677.871844] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1677.872064] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1677.872236] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1677.872390] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1677.873895] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4465ec9-de57-4e22-a127-079072f2960e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1677.882413] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e88b17e5-68ad-4af4-bb42-328fd8345cf4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1677.897275] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb2d257a-cbeb-439b-a186-763e2b56a3b8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1677.903350] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d82822bc-8105-44f6-a4da-5eb40bf18add {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1677.931784] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180965MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) 
_report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1677.931949] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1677.932165] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1678.010490] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance b3d31ca3-9a7a-49d0-955f-1e12808bf11f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1678.010664] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 8dbbc2e6-9993-4bf0-b66b-6e685789221c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1678.010794] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1678.010920] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 021f1a86-6015-4a22-b501-3ec9079edbec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1678.011058] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 25d67f98-c132-434b-9d22-4569585527eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1678.011184] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 41b5c5ec-936a-4abe-9db7-38d0d2aa371d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1678.011362] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance e1849daf-3781-42ef-bede-267efbb652c9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1678.011410] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2875b0a3-0213-4908-b86b-ce45a8901553 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1678.011515] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance dfb92d1c-c2a5-49c1-8526-3743cb385c97 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1678.011674] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance ad60df35-54c0-459e-8a25-981922ae0a88 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1678.022712] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance b2c40a66-699c-4185-8ffa-85dbfc4463c5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1678.032670] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 1c6116c0-84c2-40bd-84d2-bf1f4a5b9a10 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1678.041707] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 9fc9f6b0-928e-46b4-ad7c-9217b2f31575 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1678.051388] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 15893b5f-a02a-4ce7-80c9-eea0658f9ac7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1678.051607] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1678.051752] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1678.239414] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41917277-c928-46d4-bf33-262258a25266 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1678.246984] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-093d6c8e-0cd0-41d5-9498-e68c44c783a1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1678.277890] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-585c6c6a-791a-4270-8142-e1aa1a8222ea {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1678.285128] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-082ba4f1-ab42-4f87-a71a-9cd1523197d7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1678.298571] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1678.307234] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1678.323744] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1678.323923] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.392s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1682.430344] env[67893]: DEBUG oslo_concurrency.lockutils [None 
req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Acquiring lock "94760898-4f3c-4f41-85be-366f4108d0ba" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1682.430651] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Lock "94760898-4f3c-4f41-85be-366f4108d0ba" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1686.600999] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquiring lock "11000d92-0094-4561-a807-ca76610ea549" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1686.601293] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock "11000d92-0094-4561-a807-ca76610ea549" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1710.933240] env[67893]: WARNING oslo_vmware.rw_handles [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1710.933240] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1710.933240] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1710.933240] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1710.933240] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1710.933240] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 1710.933240] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1710.933240] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1710.933240] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1710.933240] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1710.933240] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1710.933240] env[67893]: ERROR oslo_vmware.rw_handles [ 1710.934241] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] 
Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/100d2234-c4fa-4e40-a135-15156d109414/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1710.935650] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1710.936015] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Copying Virtual Disk [datastore1] vmware_temp/100d2234-c4fa-4e40-a135-15156d109414/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/100d2234-c4fa-4e40-a135-15156d109414/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1710.936413] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5cb9a976-0376-4e2f-ac23-eec5a8f23051 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1710.946203] env[67893]: DEBUG oslo_vmware.api [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Waiting for the task: (returnval){ [ 1710.946203] env[67893]: value = "task-3455460" [ 1710.946203] env[67893]: _type = "Task" [ 1710.946203] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1710.953746] env[67893]: DEBUG oslo_vmware.api [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Task: {'id': task-3455460, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1711.456615] env[67893]: DEBUG oslo_vmware.exceptions [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Fault InvalidArgument not matched. 
{{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1711.457042] env[67893]: DEBUG oslo_concurrency.lockutils [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1711.457679] env[67893]: ERROR nova.compute.manager [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1711.457679] env[67893]: Faults: ['InvalidArgument'] [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Traceback (most recent call last): [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] yield resources [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] self.driver.spawn(context, instance, image_meta, [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] self._fetch_image_if_missing(context, vi) [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] image_cache(vi, tmp_image_ds_loc) [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] vm_util.copy_virtual_disk( [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] session._wait_for_task(vmdk_copy_task) [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] return self.wait_for_task(task_ref) [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] return evt.wait() [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] result = hub.switch() [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] return self.greenlet.switch() [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] self.f(*self.args, **self.kw) [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] raise exceptions.translate_fault(task_info.error) [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Faults: ['InvalidArgument'] [ 1711.457679] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] [ 1711.458692] env[67893]: INFO nova.compute.manager [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Terminating instance [ 1711.459608] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1711.459836] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1711.460089] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-73004c44-c6dc-4300-9403-47409ca49fdc {{(pid=67893) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1711.463693] env[67893]: DEBUG nova.compute.manager [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1711.463889] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1711.464631] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8c3c668-4d93-4df7-bf5a-f9f5575a91da {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1711.471029] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1711.471310] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-65066254-2976-4395-93ac-613d1b4aa0ef {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1711.473428] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1711.473597] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1711.474586] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b8e7f7b0-b2b6-4e09-8780-5d1e06c83cb9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1711.479236] env[67893]: DEBUG oslo_vmware.api [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Waiting for the task: (returnval){ [ 1711.479236] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52ca98b2-508f-a911-2d63-f83b258a4026" [ 1711.479236] env[67893]: _type = "Task" [ 1711.479236] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1711.485958] env[67893]: DEBUG oslo_vmware.api [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52ca98b2-508f-a911-2d63-f83b258a4026, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1711.919957] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1711.920200] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1711.920395] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Deleting the datastore file [datastore1] b3d31ca3-9a7a-49d0-955f-1e12808bf11f {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1711.920657] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-648dcd68-c5f1-43bb-a56d-874c54ada3b8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1711.927259] env[67893]: DEBUG oslo_vmware.api [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Waiting for the task: (returnval){ [ 1711.927259] env[67893]: value = "task-3455462" [ 1711.927259] env[67893]: _type = "Task" [ 1711.927259] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1711.934898] env[67893]: DEBUG oslo_vmware.api [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Task: {'id': task-3455462, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1711.989391] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1711.989654] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Creating directory with path [datastore1] vmware_temp/1674415b-1e23-47a3-ae34-f16e9062ebd0/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1711.989906] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1a656929-27c2-474a-89b5-d3eaf6945e00 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1712.008238] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Created directory with path [datastore1] vmware_temp/1674415b-1e23-47a3-ae34-f16e9062ebd0/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1712.008423] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Fetch image to [datastore1] vmware_temp/1674415b-1e23-47a3-ae34-f16e9062ebd0/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1712.008616] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/1674415b-1e23-47a3-ae34-f16e9062ebd0/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1712.009386] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99eda1a8-7ea7-4dd9-89dc-7d5efe22438b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1712.016106] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20eff880-2c3f-46b5-84a2-b283e3b773da {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1712.025207] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-162da750-1881-462c-a0b1-e5ed30d8b476 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1712.054596] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de03eb1c-1134-48fa-ba1d-fcaacb0008b4 {{(pid=67893) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1712.059929] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c721d7e4-6445-4e3c-b57c-2a08eb829fc4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1712.079136] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1712.129166] env[67893]: DEBUG oslo_vmware.rw_handles [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1674415b-1e23-47a3-ae34-f16e9062ebd0/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1712.189354] env[67893]: DEBUG oslo_vmware.rw_handles [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1712.189354] env[67893]: DEBUG oslo_vmware.rw_handles [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1674415b-1e23-47a3-ae34-f16e9062ebd0/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1712.439379] env[67893]: DEBUG oslo_vmware.api [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Task: {'id': task-3455462, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076926} completed successfully. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1712.439379] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1712.439379] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1712.439379] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1712.439379] env[67893]: INFO nova.compute.manager [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Took 0.97 seconds to destroy the instance on the hypervisor. [ 1712.440205] env[67893]: DEBUG nova.compute.claims [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1712.440387] env[67893]: DEBUG oslo_concurrency.lockutils [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1712.440644] env[67893]: DEBUG oslo_concurrency.lockutils [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1712.650318] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e1dc0ce-8952-4d40-b8a0-45ab509f15ff {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1712.657484] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d7e61f1-85de-41c7-b4b0-03b511fc9080 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1712.686222] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcc98103-c839-4166-aeda-bba9cfd2cb15 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1712.693287] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-f7a56967-5219-4664-9200-a36228f68ea6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1712.705931] env[67893]: DEBUG nova.compute.provider_tree [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1712.714874] env[67893]: DEBUG nova.scheduler.client.report [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1712.728354] env[67893]: DEBUG oslo_concurrency.lockutils [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.288s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1712.728880] env[67893]: ERROR nova.compute.manager [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1712.728880] env[67893]: Faults: ['InvalidArgument'] [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Traceback (most recent call last): [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] self.driver.spawn(context, instance, image_meta, [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] self._fetch_image_if_missing(context, vi) [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: 
b3d31ca3-9a7a-49d0-955f-1e12808bf11f] image_cache(vi, tmp_image_ds_loc) [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] vm_util.copy_virtual_disk( [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] session._wait_for_task(vmdk_copy_task) [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] return self.wait_for_task(task_ref) [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] return evt.wait() [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] result = hub.switch() [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] return self.greenlet.switch() [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] self.f(*self.args, **self.kw) [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] raise exceptions.translate_fault(task_info.error) [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Faults: ['InvalidArgument'] [ 1712.728880] env[67893]: ERROR nova.compute.manager [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] [ 1712.730157] env[67893]: DEBUG nova.compute.utils [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1712.730985] env[67893]: DEBUG 
nova.compute.manager [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Build of instance b3d31ca3-9a7a-49d0-955f-1e12808bf11f was re-scheduled: A specified parameter was not correct: fileType [ 1712.730985] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1712.731374] env[67893]: DEBUG nova.compute.manager [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1712.731555] env[67893]: DEBUG nova.compute.manager [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1712.731755] env[67893]: DEBUG nova.compute.manager [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1712.731938] env[67893]: DEBUG nova.network.neutron [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1713.111620] env[67893]: DEBUG nova.network.neutron [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1713.127244] env[67893]: INFO nova.compute.manager [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Took 0.39 seconds to deallocate network for instance. 
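The build failure above follows a consistent pattern: the CopyVirtualDisk_Task fails with a VMware fault that oslo.vmware cannot map to a more specific exception class ("Fault InvalidArgument not matched"), so it surfaces as a generic VimFaultException carrying Faults: ['InvalidArgument']; the compute manager then aborts the resource claim, re-schedules the build, and deallocates networking. A minimal, self-contained sketch of that classification step, using a stand-in exception with the same fault_list shape as oslo_vmware.exceptions.VimFaultException (the class body, fault set, and function names here are illustrative, not Nova's actual retry logic):

    class VimFaultException(Exception):
        """Stand-in mirroring the fault_list attribute of
        oslo_vmware.exceptions.VimFaultException."""
        def __init__(self, fault_list, msg):
            super().__init__(msg)
            self.fault_list = fault_list
            self.msg = msg

    # Assumption for illustration: treat InvalidArgument as retriable, since
    # the log shows the build being re-scheduled rather than failed outright.
    RETRIABLE_FAULTS = {'InvalidArgument'}

    def classify_spawn_failure(exc):
        """Decide whether a failed spawn should be re-scheduled or aborted."""
        if isinstance(exc, VimFaultException) and set(exc.fault_list) & RETRIABLE_FAULTS:
            return 'reschedule'
        return 'abort'

    try:
        raise VimFaultException(['InvalidArgument'],
                                'A specified parameter was not correct: fileType')
    except VimFaultException as exc:
        print(classify_spawn_failure(exc))  # -> reschedule

Note the lock timings in the entries that follow: the per-instance lock was held for the full 669 seconds of the failed build, and the competing terminate request had waited 472 seconds behind it before proceeding.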
[ 1713.233489] env[67893]: INFO nova.scheduler.client.report [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Deleted allocations for instance b3d31ca3-9a7a-49d0-955f-1e12808bf11f [ 1713.269831] env[67893]: DEBUG oslo_concurrency.lockutils [None req-441f74ab-dbfb-4117-bb86-50f4e959bbce tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "b3d31ca3-9a7a-49d0-955f-1e12808bf11f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 669.479s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1713.271011] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b35da648-c1db-48c5-8c47-853072dc757e tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "b3d31ca3-9a7a-49d0-955f-1e12808bf11f" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 472.819s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1713.271264] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b35da648-c1db-48c5-8c47-853072dc757e tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "b3d31ca3-9a7a-49d0-955f-1e12808bf11f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1713.271483] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b35da648-c1db-48c5-8c47-853072dc757e tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "b3d31ca3-9a7a-49d0-955f-1e12808bf11f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1713.271718] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b35da648-c1db-48c5-8c47-853072dc757e tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "b3d31ca3-9a7a-49d0-955f-1e12808bf11f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1713.274030] env[67893]: INFO nova.compute.manager [None req-b35da648-c1db-48c5-8c47-853072dc757e tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Terminating instance [ 1713.275879] env[67893]: DEBUG nova.compute.manager [None req-b35da648-c1db-48c5-8c47-853072dc757e tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Start destroying the instance on the hypervisor. 
{{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1713.275879] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-b35da648-c1db-48c5-8c47-853072dc757e tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1713.276287] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5c7a57dc-ade0-4b04-ab0a-1cb121199cad {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1713.286439] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-306cc151-4a1d-419b-994d-fda4d9b5c88f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1713.298033] env[67893]: DEBUG nova.compute.manager [None req-a0e599db-ae9e-41ec-9dd4-8ad767b54843 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: b2c40a66-699c-4185-8ffa-85dbfc4463c5] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1713.319814] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-b35da648-c1db-48c5-8c47-853072dc757e tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b3d31ca3-9a7a-49d0-955f-1e12808bf11f could not be found. [ 1713.320050] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-b35da648-c1db-48c5-8c47-853072dc757e tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1713.320239] env[67893]: INFO nova.compute.manager [None req-b35da648-c1db-48c5-8c47-853072dc757e tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1713.320533] env[67893]: DEBUG oslo.service.loopingcall [None req-b35da648-c1db-48c5-8c47-853072dc757e tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1713.320999] env[67893]: DEBUG nova.compute.manager [-] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1713.320999] env[67893]: DEBUG nova.network.neutron [-] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1713.340213] env[67893]: DEBUG nova.compute.manager [None req-a0e599db-ae9e-41ec-9dd4-8ad767b54843 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: b2c40a66-699c-4185-8ffa-85dbfc4463c5] Instance disappeared before build. 
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1713.346951] env[67893]: DEBUG nova.network.neutron [-] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1713.355681] env[67893]: INFO nova.compute.manager [-] [instance: b3d31ca3-9a7a-49d0-955f-1e12808bf11f] Took 0.03 seconds to deallocate network for instance. [ 1713.363072] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a0e599db-ae9e-41ec-9dd4-8ad767b54843 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock "b2c40a66-699c-4185-8ffa-85dbfc4463c5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 222.725s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1713.374545] env[67893]: DEBUG nova.compute.manager [None req-fb4f8be8-000d-4eeb-b7f2-de178a5536ea tempest-ServersNegativeTestMultiTenantJSON-1847586781 tempest-ServersNegativeTestMultiTenantJSON-1847586781-project-member] [instance: 1c6116c0-84c2-40bd-84d2-bf1f4a5b9a10] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1713.423389] env[67893]: DEBUG nova.compute.manager [None req-fb4f8be8-000d-4eeb-b7f2-de178a5536ea tempest-ServersNegativeTestMultiTenantJSON-1847586781 tempest-ServersNegativeTestMultiTenantJSON-1847586781-project-member] [instance: 1c6116c0-84c2-40bd-84d2-bf1f4a5b9a10] Instance disappeared before build. {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1713.445567] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fb4f8be8-000d-4eeb-b7f2-de178a5536ea tempest-ServersNegativeTestMultiTenantJSON-1847586781 tempest-ServersNegativeTestMultiTenantJSON-1847586781-project-member] Lock "1c6116c0-84c2-40bd-84d2-bf1f4a5b9a10" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 204.835s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1713.456906] env[67893]: DEBUG nova.compute.manager [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Starting instance... 
[ 1713.521647] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b35da648-c1db-48c5-8c47-853072dc757e tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "b3d31ca3-9a7a-49d0-955f-1e12808bf11f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.251s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1713.547057] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1713.547416] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1713.548975] env[67893]: INFO nova.compute.claims [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1713.793760] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-618bea83-5c20-4015-8d2d-18f966352e2e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1713.801679] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e5a65a5-7d06-4cd2-91b0-a5d7863033b3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1713.835300] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cca30e66-fe63-4224-8436-37473a0cbf3a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1713.843053] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e9e4163-bc97-4b2a-a83b-50846d0f658b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1713.857600] env[67893]: DEBUG nova.compute.provider_tree [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1713.868738] env[67893]: DEBUG nova.scheduler.client.report [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1713.903738] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.356s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1713.904280] env[67893]: DEBUG nova.compute.manager [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 1713.953581] env[67893]: DEBUG nova.compute.utils [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1713.955604] env[67893]: DEBUG nova.compute.manager [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1713.955796] env[67893]: DEBUG nova.network.neutron [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1713.968746] env[67893]: DEBUG nova.compute.manager [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1714.039053] env[67893]: DEBUG nova.compute.manager [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Start spawning the instance on the hypervisor. {{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
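The "compute_resources" acquire/release pairs above come from oslo.concurrency's lockutils wrapper, which logs wait and hold times around the resource tracker's claim so concurrent claims against one compute node are serialized. A small sketch of the same serialization pattern; illustrative only, since Nova wraps this decorator with extra behavior of its own:

```python
from oslo_concurrency import lockutils

@lockutils.synchronized('compute_resources')
def instance_claim(memory_mb, vcpus):
    # Everything here runs under the same named in-process lock; the
    # "acquired ... waited" / "released ... held" lines in the log are
    # emitted by lockutils' wrapper around calls like this.
    return {'MEMORY_MB': memory_mb, 'VCPU': vcpus}

print(instance_claim(128, 1))
```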
[ 1714.057412] env[67893]: DEBUG nova.policy [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '27b8d53ae8fe4be49e764f9f03140309', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e3e07da634bd41dcbcc11ad1881be142', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1714.071108] env[67893]: DEBUG nova.virt.hardware [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=<?>,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-15T20:39:38Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1714.071479] env[67893]: DEBUG nova.virt.hardware [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1714.071708] env[67893]: DEBUG nova.virt.hardware [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1714.071971] env[67893]: DEBUG nova.virt.hardware [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1714.072210] env[67893]: DEBUG nova.virt.hardware [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1714.072429] env[67893]: DEBUG nova.virt.hardware [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1714.072732] env[67893]: DEBUG nova.virt.hardware [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1714.072960] env[67893]: DEBUG nova.virt.hardware [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1714.073211] env[67893]: DEBUG nova.virt.hardware [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1714.073445] env[67893]: DEBUG nova.virt.hardware [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1714.073695] env[67893]: DEBUG nova.virt.hardware [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1714.074930] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5499947c-7756-4f0c-bcc4-2f8f2d57ea6e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1714.084813] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79903db9-cdaa-4d07-bdc4-bf4846a367b0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1714.586356] env[67893]: DEBUG nova.network.neutron [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Successfully created port: 63dbfc18-c188-4b63-a3ac-ce3866a3aa5a {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1715.148876] env[67893]: DEBUG nova.compute.manager [req-600a690e-9e7c-4075-981a-8ca11636813a req-58c05e6c-1385-4e26-8574-4c898aecb240 service nova] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Received event network-vif-plugged-63dbfc18-c188-4b63-a3ac-ce3866a3aa5a {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1715.149159] env[67893]: DEBUG oslo_concurrency.lockutils [req-600a690e-9e7c-4075-981a-8ca11636813a req-58c05e6c-1385-4e26-8574-4c898aecb240 service nova] Acquiring lock "9fc9f6b0-928e-46b4-ad7c-9217b2f31575-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1715.149402] env[67893]: DEBUG oslo_concurrency.lockutils [req-600a690e-9e7c-4075-981a-8ca11636813a req-58c05e6c-1385-4e26-8574-4c898aecb240 service nova] Lock "9fc9f6b0-928e-46b4-ad7c-9217b2f31575-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
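The topology walk above enumerates every sockets:cores:threads factorization of the flavor's vCPU count that fits the 65536 limits, then sorts the candidates against the preferences; for a 1-vCPU m1.nano the only candidate is 1:1:1. A naive re-derivation of that enumeration, which is illustrative and not Nova's actual implementation:

```python
import itertools
from collections import namedtuple

VirtCPUTopology = namedtuple('VirtCPUTopology', 'sockets cores threads')

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    # Every sockets*cores*threads factorization of the vCPU count that
    # fits within the limits; the caller then sorts by preference.
    for s, c, t in itertools.product(range(1, vcpus + 1), repeat=3):
        if (s * c * t == vcpus and s <= max_sockets
                and c <= max_cores and t <= max_threads):
            yield VirtCPUTopology(s, c, t)

# One vCPU has exactly one factorization, matching the log:
print(list(possible_topologies(1)))
# [VirtCPUTopology(sockets=1, cores=1, threads=1)]
```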
"9fc9f6b0-928e-46b4-ad7c-9217b2f31575-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1715.149580] env[67893]: DEBUG oslo_concurrency.lockutils [req-600a690e-9e7c-4075-981a-8ca11636813a req-58c05e6c-1385-4e26-8574-4c898aecb240 service nova] Lock "9fc9f6b0-928e-46b4-ad7c-9217b2f31575-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1715.149751] env[67893]: DEBUG nova.compute.manager [req-600a690e-9e7c-4075-981a-8ca11636813a req-58c05e6c-1385-4e26-8574-4c898aecb240 service nova] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] No waiting events found dispatching network-vif-plugged-63dbfc18-c188-4b63-a3ac-ce3866a3aa5a {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1715.149920] env[67893]: WARNING nova.compute.manager [req-600a690e-9e7c-4075-981a-8ca11636813a req-58c05e6c-1385-4e26-8574-4c898aecb240 service nova] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Received unexpected event network-vif-plugged-63dbfc18-c188-4b63-a3ac-ce3866a3aa5a for instance with vm_state building and task_state spawning. [ 1715.159169] env[67893]: DEBUG nova.network.neutron [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Successfully updated port: 63dbfc18-c188-4b63-a3ac-ce3866a3aa5a {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1715.174255] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Acquiring lock "refresh_cache-9fc9f6b0-928e-46b4-ad7c-9217b2f31575" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1715.174418] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Acquired lock "refresh_cache-9fc9f6b0-928e-46b4-ad7c-9217b2f31575" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1715.174570] env[67893]: DEBUG nova.network.neutron [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1715.227567] env[67893]: DEBUG nova.network.neutron [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Instance cache missing network info. 
{{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1715.452163] env[67893]: DEBUG nova.network.neutron [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Updating instance_info_cache with network_info: [{"id": "63dbfc18-c188-4b63-a3ac-ce3866a3aa5a", "address": "fa:16:3e:29:d5:68", "network": {"id": "f26023df-c6cc-4297-baa9-417e31b8a27b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-657445420-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e3e07da634bd41dcbcc11ad1881be142", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "06cc7c49-c46c-4c1e-bf51-77e9ea802c40", "external-id": "nsx-vlan-transportzone-450", "segmentation_id": 450, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap63dbfc18-c1", "ovs_interfaceid": "63dbfc18-c188-4b63-a3ac-ce3866a3aa5a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1715.467516] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Releasing lock "refresh_cache-9fc9f6b0-928e-46b4-ad7c-9217b2f31575" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1715.467855] env[67893]: DEBUG nova.compute.manager [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Instance network_info: |[{"id": "63dbfc18-c188-4b63-a3ac-ce3866a3aa5a", "address": "fa:16:3e:29:d5:68", "network": {"id": "f26023df-c6cc-4297-baa9-417e31b8a27b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-657445420-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e3e07da634bd41dcbcc11ad1881be142", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "06cc7c49-c46c-4c1e-bf51-77e9ea802c40", "external-id": "nsx-vlan-transportzone-450", "segmentation_id": 450, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap63dbfc18-c1", "ovs_interfaceid": "63dbfc18-c188-4b63-a3ac-ce3866a3aa5a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async 
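The "unexpected event" warning above is the external-event race: Neutron's network-vif-plugged callback arrived before the spawn path had registered a waiter for it, so the dispatcher found nothing to pop and dropped the event with a warning. A toy version of that waiter registry, with threading.Event standing in for Nova's eventlet-based event objects:

```python
import threading

class InstanceEvents:
    """Toy waiter registry; Nova's real one is greenlet-based."""

    def __init__(self):
        self._waiters = {}  # (instance_uuid, event_name) -> Event

    def prepare_for_event(self, instance_uuid, event_name):
        ev = threading.Event()
        self._waiters[(instance_uuid, event_name)] = ev
        return ev

    def pop_instance_event(self, instance_uuid, event_name):
        ev = self._waiters.pop((instance_uuid, event_name), None)
        if ev is None:
            # No one is waiting yet -- the race visible in the log above.
            print("Received unexpected event %s for %s"
                  % (event_name, instance_uuid))
        else:
            ev.set()

events = InstanceEvents()
# Neutron's callback lands before spawn registered a waiter:
events.pop_instance_event("9fc9f6b0", "network-vif-plugged")
```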
[ 1715.468283] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:29:d5:68', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '06cc7c49-c46c-4c1e-bf51-77e9ea802c40', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '63dbfc18-c188-4b63-a3ac-ce3866a3aa5a', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1715.476446] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Creating folder: Project (e3e07da634bd41dcbcc11ad1881be142). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1715.477013] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-24205f0c-9729-4c0e-8fcc-91382b4a0208 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1715.488986] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Created folder: Project (e3e07da634bd41dcbcc11ad1881be142) in parent group-v689771.
[ 1715.489224] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Creating folder: Instances. Parent ref: group-v689866. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1715.489508] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c5125c8f-a994-4db9-be76-3ea8a451c4fc {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1715.497802] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Created folder: Instances in parent group-v689866.
[ 1715.498050] env[67893]: DEBUG oslo.service.loopingcall [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1715.498234] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1715.498432] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a2b0e97c-7036-4a32-b8ab-d81115c25fb3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1715.517537] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1715.517537] env[67893]: value = "task-3455465"
[ 1715.517537] env[67893]: _type = "Task"
[ 1715.517537] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1715.524759] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455465, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1716.027197] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455465, 'name': CreateVM_Task, 'duration_secs': 0.346426} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1716.027608] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1716.028099] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1716.028270] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1716.028595] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1716.028838] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2209db44-b9b9-4406-9fc9-aa1feb81d324 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1716.033011] env[67893]: DEBUG oslo_vmware.api [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Waiting for the task: (returnval){
[ 1716.033011] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52aea6ea-0cc1-4942-8e09-1cee1fc5bcc9"
[ 1716.033011] env[67893]: _type = "Task"
[ 1716.033011] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1716.040114] env[67893]: DEBUG oslo_vmware.api [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52aea6ea-0cc1-4942-8e09-1cee1fc5bcc9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
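The "Invoking ...", "Waiting for the task", and "progress is 0%." lines above are oslo.vmware's request and task-polling cycle. A short sketch of driving the same library directly; the endpoint and credentials are placeholders, and nothing here runs without a reachable vCenter:

```python
from oslo_vmware import api

# Placeholder endpoint and credentials (assumptions, not the log's host).
session = api.VMwareAPISession('vc.example.test', 'user', 'secret',
                               api_retry_count=10, task_poll_interval=0.5)

# invoke_api() issues one SOAP call (the "Invoking ..." lines); methods
# named *_Task return a task reference instead of a direct result.
vm_refs = session.invoke_api(session.vim, 'FindAllByUuid',
                             session.vim.service_content.searchIndex,
                             uuid='9fc9f6b0-928e-46b4-ad7c-9217b2f31575',
                             vmSearch=True, instanceUuid=True)

# For a task ref, wait_for_task() polls until SUCCESS (the "progress is
# 0%." then "completed successfully" lines) or raises a translated fault:
# task = session.invoke_api(session.vim, 'PowerOffVM_Task', vm_refs[0])
# session.wait_for_task(task)
```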
[ 1716.544020] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1716.544305] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1716.544624] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1717.174505] env[67893]: DEBUG nova.compute.manager [req-70ef9947-3689-4969-ae04-d5ce5cf56827 req-457e2289-658c-43cf-9153-ee8494193365 service nova] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Received event network-changed-63dbfc18-c188-4b63-a3ac-ce3866a3aa5a {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1717.174749] env[67893]: DEBUG nova.compute.manager [req-70ef9947-3689-4969-ae04-d5ce5cf56827 req-457e2289-658c-43cf-9153-ee8494193365 service nova] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Refreshing instance network info cache due to event network-changed-63dbfc18-c188-4b63-a3ac-ce3866a3aa5a. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1717.174851] env[67893]: DEBUG oslo_concurrency.lockutils [req-70ef9947-3689-4969-ae04-d5ce5cf56827 req-457e2289-658c-43cf-9153-ee8494193365 service nova] Acquiring lock "refresh_cache-9fc9f6b0-928e-46b4-ad7c-9217b2f31575" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1717.174995] env[67893]: DEBUG oslo_concurrency.lockutils [req-70ef9947-3689-4969-ae04-d5ce5cf56827 req-457e2289-658c-43cf-9153-ee8494193365 service nova] Acquired lock "refresh_cache-9fc9f6b0-928e-46b4-ad7c-9217b2f31575" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1717.175173] env[67893]: DEBUG nova.network.neutron [req-70ef9947-3689-4969-ae04-d5ce5cf56827 req-457e2289-658c-43cf-9153-ee8494193365 service nova] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Refreshing network info cache for port 63dbfc18-c188-4b63-a3ac-ce3866a3aa5a {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1717.469480] env[67893]: DEBUG nova.network.neutron [req-70ef9947-3689-4969-ae04-d5ce5cf56827 req-457e2289-658c-43cf-9153-ee8494193365 service nova] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Updated VIF entry in instance network info cache for port 63dbfc18-c188-4b63-a3ac-ce3866a3aa5a. {{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1717.469967] env[67893]: DEBUG nova.network.neutron [req-70ef9947-3689-4969-ae04-d5ce5cf56827 req-457e2289-658c-43cf-9153-ee8494193365 service nova] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Updating instance_info_cache with network_info: [{"id": "63dbfc18-c188-4b63-a3ac-ce3866a3aa5a", "address": "fa:16:3e:29:d5:68", "network": {"id": "f26023df-c6cc-4297-baa9-417e31b8a27b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-657445420-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e3e07da634bd41dcbcc11ad1881be142", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "06cc7c49-c46c-4c1e-bf51-77e9ea802c40", "external-id": "nsx-vlan-transportzone-450", "segmentation_id": 450, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap63dbfc18-c1", "ovs_interfaceid": "63dbfc18-c188-4b63-a3ac-ce3866a3aa5a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1717.479594] env[67893]: DEBUG oslo_concurrency.lockutils [req-70ef9947-3689-4969-ae04-d5ce5cf56827 req-457e2289-658c-43cf-9153-ee8494193365 service nova] Releasing lock "refresh_cache-9fc9f6b0-928e-46b4-ad7c-9217b2f31575" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1725.323778] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1729.860364] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1729.860679] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 1729.860774] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 1729.882993] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
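The "Running periodic task" entries above come from oslo.service's periodic_task machinery: methods decorated with a spacing are executed from run_periodic_tasks whenever they are due. A minimal sketch; the spacing value and run_immediately flag here are illustrative, not Nova's configuration:

```python
from oslo_config import cfg
from oslo_service import periodic_task

class Manager(periodic_task.PeriodicTasks):
    @periodic_task.periodic_task(spacing=10, run_immediately=True)
    def _heal_instance_info_cache(self, context):
        print('Starting heal instance info cache')

mgr = Manager(cfg.ConfigOpts())
# A service calls this on a timer; each decorated task runs when its
# spacing has elapsed, producing the "Running periodic task" lines.
mgr.run_periodic_tasks(context=None)
```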
[ 1729.883350] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1729.883509] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1729.883677] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1729.883829] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1729.883969] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1729.884139] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1729.884276] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1729.884425] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1729.884576] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1729.884716] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 1731.859326] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1733.855740] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1733.858398] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1733.858597] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1733.858779] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1733.858926] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 1735.859618] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1737.860389] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1737.874455] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1737.874685] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1737.874854] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1737.875024] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
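The _reclaim_queued_deletes skip a few entries above is config-driven: reclaim_instance_interval defaults to 0 in Nova, and any value <= 0 disables deferred delete entirely. A sketch of the same oslo.config pattern, using a standalone ConfigOpts rather than Nova's global CONF:

```python
from oslo_config import cfg

CONF = cfg.ConfigOpts()
CONF.register_opts([
    cfg.IntOpt('reclaim_instance_interval', default=0,
               help='Seconds to keep SOFT_DELETED instances before '
                    'reclaiming them; <= 0 disables deferred delete.'),
])
CONF([])  # parse an empty command line so the option is readable

if CONF.reclaim_instance_interval <= 0:
    print('CONF.reclaim_instance_interval <= 0, skipping...')
```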
[ 1737.876238] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e13b5c85-5073-4051-a359-daf1da341937 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1737.885518] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be2e1d6e-e9f7-42ef-b82c-90f09f4d519c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1737.899669] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a499890-fab7-4175-86ff-99266b3dd0c7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1737.905965] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d646ab5-0c01-4853-8a6a-853fdd7e8b09 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1737.934950] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180942MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 1737.935127] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1737.935322] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1738.012396] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 8dbbc2e6-9993-4bf0-b66b-6e685789221c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1738.012585] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1738.012740] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 021f1a86-6015-4a22-b501-3ec9079edbec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1738.012867] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 25d67f98-c132-434b-9d22-4569585527eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1738.012988] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 41b5c5ec-936a-4abe-9db7-38d0d2aa371d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1738.013122] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance e1849daf-3781-42ef-bede-267efbb652c9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1738.013239] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2875b0a3-0213-4908-b86b-ce45a8901553 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1738.013350] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance dfb92d1c-c2a5-49c1-8526-3743cb385c97 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1738.013464] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance ad60df35-54c0-459e-8a25-981922ae0a88 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1738.013571] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 9fc9f6b0-928e-46b4-ad7c-9217b2f31575 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1738.024183] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 15893b5f-a02a-4ce7-80c9-eea0658f9ac7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1738.034263] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 94760898-4f3c-4f41-85be-366f4108d0ba has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1738.043462] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 11000d92-0094-4561-a807-ca76610ea549 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1738.043670] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 1738.043817] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 1738.186767] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82365f63-2f57-4df8-8890-61054f11ad7f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1738.194071] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-311d65a4-69f6-42aa-b8d6-163600282a5d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1738.224440] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-888240ee-1b21-4c6d-a613-a7768cc43faf {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1738.231308] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7db01266-b1b2-4c8b-bccb-ffb14084f6bb {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1738.244281] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1738.252903] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
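The inventory dict above is what placement prices allocations against. Assuming placement's usual capacity formula, (total - reserved) * allocation_ratio, a quick check of the numbers shows why the 10 allocated vCPUs and ten 128 MB instances fit with room to spare (1280 MB used plus the 512 MB reserve matches the 1792 MB used_ram in the audit above):

```python
# Inventory exactly as logged above.
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
}

for rc, inv in inventory.items():
    # Assumed formula: effective capacity per resource class.
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(rc, capacity)
# VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0
```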
[ 1738.265936] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 1738.266151] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.331s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1759.963366] env[67893]: WARNING oslo_vmware.rw_handles [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1759.963366] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1759.963366] env[67893]: ERROR oslo_vmware.rw_handles   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1759.963366] env[67893]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 1759.963366] env[67893]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1759.963366] env[67893]: ERROR oslo_vmware.rw_handles     response.begin()
[ 1759.963366] env[67893]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1759.963366] env[67893]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 1759.963366] env[67893]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1759.963366] env[67893]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 1759.963366] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1759.963366] env[67893]: ERROR oslo_vmware.rw_handles
[ 1759.963968] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/1674415b-1e23-47a3-ae34-f16e9062ebd0/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1759.965623] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1759.965874] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Copying Virtual Disk [datastore1] vmware_temp/1674415b-1e23-47a3-ae34-f16e9062ebd0/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/1674415b-1e23-47a3-ae34-f16e9062ebd0/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1759.966177] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-45a8ad3e-5f9b-4c08-966c-79d13f1526d6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1759.974884] env[67893]: DEBUG oslo_vmware.api [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Waiting for the task: (returnval){
[ 1759.974884] env[67893]: value = "task-3455466"
[ 1759.974884] env[67893]: _type = "Task"
[ 1759.974884] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1759.983346] env[67893]: DEBUG oslo_vmware.api [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Task: {'id': task-3455466, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1760.484862] env[67893]: DEBUG oslo_vmware.exceptions [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 1760.485179] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1760.485732] env[67893]: ERROR nova.compute.manager [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1760.485732] env[67893]: Faults: ['InvalidArgument']
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Traceback (most recent call last):
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]   File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]     yield resources
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]     self.driver.spawn(context, instance, image_meta,
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]     self._fetch_image_if_missing(context, vi)
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]     image_cache(vi, tmp_image_ds_loc)
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]     vm_util.copy_virtual_disk(
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]     session._wait_for_task(vmdk_copy_task)
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]     return self.wait_for_task(task_ref)
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]     return evt.wait()
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]     result = hub.switch()
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]     return self.greenlet.switch()
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]     self.f(*self.args, **self.kw)
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]     raise exceptions.translate_fault(task_info.error)
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Faults: ['InvalidArgument']
[ 1760.485732] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c]
[ 1760.487055] env[67893]: INFO nova.compute.manager [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Terminating instance
[ 1760.487681] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1760.487894] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1760.488143] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c3d125e0-f37f-4578-ac41-1fef748dc219 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1760.490373] env[67893]: DEBUG nova.compute.manager [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Start destroying the instance on the hypervisor.
{{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1760.490580] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1760.491298] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bca1136-53b0-4120-b468-0c99b8cac8cc {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1760.498107] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1760.498309] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9df53a73-7ecb-490d-a373-9b7c94130013 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1760.500406] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1760.500574] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1760.501519] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-acf2add2-a1e9-46f5-ac35-b09c036f57d8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1760.505945] env[67893]: DEBUG oslo_vmware.api [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Waiting for the task: (returnval){ [ 1760.505945] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52e52abc-7850-a436-b879-7d1e0d3e0dc1" [ 1760.505945] env[67893]: _type = "Task" [ 1760.505945] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1760.514891] env[67893]: DEBUG oslo_vmware.api [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52e52abc-7850-a436-b879-7d1e0d3e0dc1, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1760.578507] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1760.578711] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1760.578891] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Deleting the datastore file [datastore1] 8dbbc2e6-9993-4bf0-b66b-6e685789221c {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1760.579169] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4661ed00-23c7-4e78-8204-444e25f86b94 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1760.585118] env[67893]: DEBUG oslo_vmware.api [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Waiting for the task: (returnval){ [ 1760.585118] env[67893]: value = "task-3455468" [ 1760.585118] env[67893]: _type = "Task" [ 1760.585118] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1760.592721] env[67893]: DEBUG oslo_vmware.api [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Task: {'id': task-3455468, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1761.016361] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1761.016688] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Creating directory with path [datastore1] vmware_temp/09a715de-2911-4cf3-af36-5e1fe196b827/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1761.016853] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c0efd511-0bad-4f19-b229-b9267c752312 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1761.028968] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Created directory with path [datastore1] vmware_temp/09a715de-2911-4cf3-af36-5e1fe196b827/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1761.029190] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Fetch image to [datastore1] vmware_temp/09a715de-2911-4cf3-af36-5e1fe196b827/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1761.029366] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/09a715de-2911-4cf3-af36-5e1fe196b827/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1761.030141] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00467cee-06f9-4cd0-95b7-1881ce8450e8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1761.036950] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4b20b9f-d4db-422a-b5e8-0997d3e9c995 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1761.045920] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1dbf2eb9-5689-4574-b98f-50bce4010a09 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1761.075349] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-34864b47-8e3d-47e9-8567-9a7e7031afdf {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1761.081847] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b9c4de67-0669-45a2-91cb-86b91e83190a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1761.093046] env[67893]: DEBUG oslo_vmware.api [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Task: {'id': task-3455468, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070009} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1761.093298] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1761.093481] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1761.093653] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1761.093950] env[67893]: INFO nova.compute.manager [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Took 0.60 seconds to destroy the instance on the hypervisor. 
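Note: the CopyVirtualDisk and DeleteDatastoreFile tasks above (task-3455466, task-3455468) both follow oslo.vmware's generic poll-until-done pattern visible in these records: repeated "progress is 0%" entries while the task runs, then either a "completed successfully" record carrying duration_secs or a translated fault such as the "A specified parameter was not correct: fileType" InvalidArgument error. A minimal sketch of that loop follows; get_task_info is a hypothetical helper standing in for the PropertyCollector round-trip that oslo.vmware actually performs, and is assumed to return an object with state, progress, and error fields.

import time

POLL_INTERVAL = 0.5  # oslo.vmware polls on a similar sub-second cadence


class TaskFault(Exception):
    """Raised when a vCenter task finishes in the 'error' state."""


def wait_for_task(task_ref, get_task_info):
    # Returns the task duration on success, mirroring the
    # "duration_secs" field in the "completed successfully" entries.
    start = time.monotonic()
    while True:
        info = get_task_info(task_ref)
        if info.state == "success":
            return time.monotonic() - start
        if info.state == "error":
            # oslo.vmware tries to translate the fault to a specific
            # exception class; unmatched faults (see "Fault
            # InvalidArgument not matched" above) fall back to a
            # generic VimFaultException. TaskFault stands in here.
            raise TaskFault(info.error)
        # state is 'queued' or 'running': log progress and poll again,
        # which is what produces the repeated "progress is 0%" records.
        time.sleep(POLL_INTERVAL)
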
[ 1761.096815] env[67893]: DEBUG nova.compute.claims [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1761.096988] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1761.097214] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1761.105630] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1761.161157] env[67893]: DEBUG oslo_vmware.rw_handles [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/09a715de-2911-4cf3-af36-5e1fe196b827/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1761.226780] env[67893]: DEBUG oslo_vmware.rw_handles [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1761.227526] env[67893]: DEBUG oslo_vmware.rw_handles [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/09a715de-2911-4cf3-af36-5e1fe196b827/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1761.358703] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3e7b551-bde1-4919-8c0d-8f315b8459c8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1761.367605] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cddfdf89-b5e0-45f8-b6de-ce3a725968f1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1761.396534] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5bf1431e-7e82-48e1-8620-d6a77298c337 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1761.404040] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c15c4990-ec3e-42f9-bd4c-ffa2ec72737e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1761.417225] env[67893]: DEBUG nova.compute.provider_tree [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1761.426091] env[67893]: DEBUG nova.scheduler.client.report [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1761.441153] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.344s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1761.441678] env[67893]: ERROR nova.compute.manager [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1761.441678] env[67893]: Faults: ['InvalidArgument'] [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Traceback (most recent call last): [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 
8dbbc2e6-9993-4bf0-b66b-6e685789221c] self.driver.spawn(context, instance, image_meta, [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] self._fetch_image_if_missing(context, vi) [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] image_cache(vi, tmp_image_ds_loc) [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] vm_util.copy_virtual_disk( [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] session._wait_for_task(vmdk_copy_task) [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] return self.wait_for_task(task_ref) [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] return evt.wait() [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] result = hub.switch() [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] return self.greenlet.switch() [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] self.f(*self.args, **self.kw) [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] raise exceptions.translate_fault(task_info.error) [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Faults: ['InvalidArgument'] [ 1761.441678] env[67893]: ERROR nova.compute.manager [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] [ 1761.442840] env[67893]: DEBUG nova.compute.utils [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1761.443710] env[67893]: DEBUG nova.compute.manager [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Build of instance 8dbbc2e6-9993-4bf0-b66b-6e685789221c was re-scheduled: A specified parameter was not correct: fileType [ 1761.443710] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1761.444118] env[67893]: DEBUG nova.compute.manager [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1761.444308] env[67893]: DEBUG nova.compute.manager [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1761.444500] env[67893]: DEBUG nova.compute.manager [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1761.444639] env[67893]: DEBUG nova.network.neutron [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1761.709169] env[67893]: DEBUG nova.network.neutron [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1761.719305] env[67893]: INFO nova.compute.manager [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Took 0.27 seconds to deallocate network for instance. [ 1761.812907] env[67893]: INFO nova.scheduler.client.report [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Deleted allocations for instance 8dbbc2e6-9993-4bf0-b66b-6e685789221c [ 1761.836282] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ec80b1b4-1c3a-469e-8a32-22d3d6ec9b8e tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Lock "8dbbc2e6-9993-4bf0-b66b-6e685789221c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 681.498s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1761.837424] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1a07c534-7070-423e-b830-3460699b3392 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Lock "8dbbc2e6-9993-4bf0-b66b-6e685789221c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 485.570s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1761.838025] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1a07c534-7070-423e-b830-3460699b3392 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Acquiring lock "8dbbc2e6-9993-4bf0-b66b-6e685789221c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1761.838025] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1a07c534-7070-423e-b830-3460699b3392 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Lock "8dbbc2e6-9993-4bf0-b66b-6e685789221c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1761.838215] env[67893]: 
DEBUG oslo_concurrency.lockutils [None req-1a07c534-7070-423e-b830-3460699b3392 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Lock "8dbbc2e6-9993-4bf0-b66b-6e685789221c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1761.840187] env[67893]: INFO nova.compute.manager [None req-1a07c534-7070-423e-b830-3460699b3392 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Terminating instance [ 1761.842043] env[67893]: DEBUG nova.compute.manager [None req-1a07c534-7070-423e-b830-3460699b3392 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1761.842043] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-1a07c534-7070-423e-b830-3460699b3392 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1761.842530] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-28c32199-fb36-4973-8015-51ea73be91b6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1761.851610] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1da83630-19d5-4e03-8404-24789641c7c9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1761.863725] env[67893]: DEBUG nova.compute.manager [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1761.884788] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-1a07c534-7070-423e-b830-3460699b3392 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8dbbc2e6-9993-4bf0-b66b-6e685789221c could not be found. [ 1761.884993] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-1a07c534-7070-423e-b830-3460699b3392 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1761.885186] env[67893]: INFO nova.compute.manager [None req-1a07c534-7070-423e-b830-3460699b3392 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Took 0.04 seconds to destroy the instance on the hypervisor. 
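Note: the lockutils records above report two durations per lock: how long the caller waited to acquire it and how long it was held, e.g. "waited 485.570s" and "held 681.498s" for the 8dbbc2e6-... instance lock. A rough sketch of the bookkeeping behind those numbers follows; it assumes a plain in-process threading.Lock, whereas oslo.concurrency's real "inner" wrapper also supports fair locks and external file locks, so this only illustrates where the two timings come from.

import functools
import threading
import time

_locks: dict[str, threading.Lock] = {}


def synchronized(name: str):
    # Decorator that serializes callers on a named lock and logs the
    # waited/held durations in the same shape as the entries above.
    lock = _locks.setdefault(name, threading.Lock())

    def decorator(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            t0 = time.monotonic()
            with lock:
                waited = time.monotonic() - t0
                print(f'Lock "{name}" acquired by "{fn.__qualname__}" '
                      f':: waited {waited:.3f}s')
                t1 = time.monotonic()
                try:
                    return fn(*args, **kwargs)
                finally:
                    held = time.monotonic() - t1
                    print(f'Lock "{name}" "released" by '
                          f'"{fn.__qualname__}" :: held {held:.3f}s')
        return inner

    return decorator
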
[ 1761.885497] env[67893]: DEBUG oslo.service.loopingcall [None req-1a07c534-7070-423e-b830-3460699b3392 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1761.885659] env[67893]: DEBUG nova.compute.manager [-] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1761.885836] env[67893]: DEBUG nova.network.neutron [-] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1761.908889] env[67893]: DEBUG nova.network.neutron [-] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1761.915991] env[67893]: INFO nova.compute.manager [-] [instance: 8dbbc2e6-9993-4bf0-b66b-6e685789221c] Took 0.03 seconds to deallocate network for instance. [ 1761.921780] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1761.921780] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1761.923163] env[67893]: INFO nova.compute.claims [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1762.008098] env[67893]: DEBUG oslo_concurrency.lockutils [None req-1a07c534-7070-423e-b830-3460699b3392 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Lock "8dbbc2e6-9993-4bf0-b66b-6e685789221c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.171s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1762.120021] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-934404bd-f528-483b-8eaa-534ce6bc05e5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1762.128892] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9dbfb7fd-e178-4c7d-88e3-579a32b6a0aa {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1762.159072] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-349611f0-a7d4-497b-8e91-805b115a80ab {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1762.165724] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99afbb45-e513-4c39-92ea-b806f4f69416 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1762.180123] env[67893]: DEBUG nova.compute.provider_tree [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1762.190924] env[67893]: DEBUG nova.scheduler.client.report [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1762.204552] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.283s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1762.205078] env[67893]: DEBUG nova.compute.manager [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1762.239701] env[67893]: DEBUG nova.compute.utils [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1762.241230] env[67893]: DEBUG nova.compute.manager [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Allocating IP information in the background. 
{{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1762.241410] env[67893]: DEBUG nova.network.neutron [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1762.254145] env[67893]: DEBUG nova.compute.manager [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1762.313165] env[67893]: DEBUG nova.policy [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '125ce06a20be4a3aa82550cf33482bba', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ceacadba48b74fc3aeaf5968e3a9a0cd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 1762.320033] env[67893]: DEBUG nova.compute.manager [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1762.345035] env[67893]: DEBUG nova.virt.hardware [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1762.345035] env[67893]: DEBUG nova.virt.hardware [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1762.345035] env[67893]: DEBUG nova.virt.hardware [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1762.345035] env[67893]: DEBUG nova.virt.hardware [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1762.345421] env[67893]: DEBUG nova.virt.hardware [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1762.345706] env[67893]: DEBUG nova.virt.hardware [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1762.346101] env[67893]: DEBUG nova.virt.hardware [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1762.346493] env[67893]: DEBUG nova.virt.hardware [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1762.346794] 
env[67893]: DEBUG nova.virt.hardware [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1762.347100] env[67893]: DEBUG nova.virt.hardware [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1762.347405] env[67893]: DEBUG nova.virt.hardware [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1762.348406] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2767f2a-caea-48cf-a1f9-f3c96c0fb62b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1762.358474] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5a076c8-fd49-4d21-b3ca-59169cbe706a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1762.666891] env[67893]: DEBUG nova.network.neutron [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Successfully created port: a0f1ad6d-706f-46d9-889a-e7855b92baeb {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1763.253580] env[67893]: DEBUG nova.network.neutron [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Successfully updated port: a0f1ad6d-706f-46d9-889a-e7855b92baeb {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1763.263386] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Acquiring lock "refresh_cache-15893b5f-a02a-4ce7-80c9-eea0658f9ac7" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1763.263538] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Acquired lock "refresh_cache-15893b5f-a02a-4ce7-80c9-eea0658f9ac7" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1763.263688] env[67893]: DEBUG nova.network.neutron [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1763.525054] env[67893]: DEBUG 
nova.network.neutron [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1763.742864] env[67893]: DEBUG nova.network.neutron [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Updating instance_info_cache with network_info: [{"id": "a0f1ad6d-706f-46d9-889a-e7855b92baeb", "address": "fa:16:3e:c0:09:fb", "network": {"id": "2e27f016-1dcd-4d7f-bcac-afbee70b1806", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-130882583-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ceacadba48b74fc3aeaf5968e3a9a0cd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4fb94adb-cc41-4c16-9830-a3205dbd2bf5", "external-id": "nsx-vlan-transportzone-100", "segmentation_id": 100, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa0f1ad6d-70", "ovs_interfaceid": "a0f1ad6d-706f-46d9-889a-e7855b92baeb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1763.745972] env[67893]: DEBUG nova.compute.manager [req-f2fd1fa3-9226-4bab-a632-810a6d404050 req-305a399e-7e48-4523-ba20-c4c742fcb889 service nova] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Received event network-vif-plugged-a0f1ad6d-706f-46d9-889a-e7855b92baeb {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1763.746211] env[67893]: DEBUG oslo_concurrency.lockutils [req-f2fd1fa3-9226-4bab-a632-810a6d404050 req-305a399e-7e48-4523-ba20-c4c742fcb889 service nova] Acquiring lock "15893b5f-a02a-4ce7-80c9-eea0658f9ac7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1763.746415] env[67893]: DEBUG oslo_concurrency.lockutils [req-f2fd1fa3-9226-4bab-a632-810a6d404050 req-305a399e-7e48-4523-ba20-c4c742fcb889 service nova] Lock "15893b5f-a02a-4ce7-80c9-eea0658f9ac7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1763.746584] env[67893]: DEBUG oslo_concurrency.lockutils [req-f2fd1fa3-9226-4bab-a632-810a6d404050 req-305a399e-7e48-4523-ba20-c4c742fcb889 service nova] Lock "15893b5f-a02a-4ce7-80c9-eea0658f9ac7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1763.746752] env[67893]: DEBUG 
nova.compute.manager [req-f2fd1fa3-9226-4bab-a632-810a6d404050 req-305a399e-7e48-4523-ba20-c4c742fcb889 service nova] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] No waiting events found dispatching network-vif-plugged-a0f1ad6d-706f-46d9-889a-e7855b92baeb {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1763.746920] env[67893]: WARNING nova.compute.manager [req-f2fd1fa3-9226-4bab-a632-810a6d404050 req-305a399e-7e48-4523-ba20-c4c742fcb889 service nova] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Received unexpected event network-vif-plugged-a0f1ad6d-706f-46d9-889a-e7855b92baeb for instance with vm_state building and task_state spawning. [ 1763.747096] env[67893]: DEBUG nova.compute.manager [req-f2fd1fa3-9226-4bab-a632-810a6d404050 req-305a399e-7e48-4523-ba20-c4c742fcb889 service nova] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Received event network-changed-a0f1ad6d-706f-46d9-889a-e7855b92baeb {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1763.747255] env[67893]: DEBUG nova.compute.manager [req-f2fd1fa3-9226-4bab-a632-810a6d404050 req-305a399e-7e48-4523-ba20-c4c742fcb889 service nova] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Refreshing instance network info cache due to event network-changed-a0f1ad6d-706f-46d9-889a-e7855b92baeb. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1763.747420] env[67893]: DEBUG oslo_concurrency.lockutils [req-f2fd1fa3-9226-4bab-a632-810a6d404050 req-305a399e-7e48-4523-ba20-c4c742fcb889 service nova] Acquiring lock "refresh_cache-15893b5f-a02a-4ce7-80c9-eea0658f9ac7" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1763.754270] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Releasing lock "refresh_cache-15893b5f-a02a-4ce7-80c9-eea0658f9ac7" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1763.754539] env[67893]: DEBUG nova.compute.manager [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Instance network_info: |[{"id": "a0f1ad6d-706f-46d9-889a-e7855b92baeb", "address": "fa:16:3e:c0:09:fb", "network": {"id": "2e27f016-1dcd-4d7f-bcac-afbee70b1806", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-130882583-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ceacadba48b74fc3aeaf5968e3a9a0cd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4fb94adb-cc41-4c16-9830-a3205dbd2bf5", "external-id": "nsx-vlan-transportzone-100", "segmentation_id": 100, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa0f1ad6d-70", "ovs_interfaceid": "a0f1ad6d-706f-46d9-889a-e7855b92baeb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1763.755058] env[67893]: DEBUG oslo_concurrency.lockutils [req-f2fd1fa3-9226-4bab-a632-810a6d404050 req-305a399e-7e48-4523-ba20-c4c742fcb889 service nova] Acquired lock "refresh_cache-15893b5f-a02a-4ce7-80c9-eea0658f9ac7" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1763.755266] env[67893]: DEBUG nova.network.neutron [req-f2fd1fa3-9226-4bab-a632-810a6d404050 req-305a399e-7e48-4523-ba20-c4c742fcb889 service nova] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Refreshing network info cache for port a0f1ad6d-706f-46d9-889a-e7855b92baeb {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1763.756259] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c0:09:fb', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '4fb94adb-cc41-4c16-9830-a3205dbd2bf5', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a0f1ad6d-706f-46d9-889a-e7855b92baeb', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1763.764160] env[67893]: DEBUG oslo.service.loopingcall [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1763.767235] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1763.767670] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5dc94be9-806e-4129-847f-b403dfe92c3a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1763.787312] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1763.787312] env[67893]: value = "task-3455469" [ 1763.787312] env[67893]: _type = "Task" [ 1763.787312] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1763.796271] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455469, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1764.019410] env[67893]: DEBUG nova.network.neutron [req-f2fd1fa3-9226-4bab-a632-810a6d404050 req-305a399e-7e48-4523-ba20-c4c742fcb889 service nova] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Updated VIF entry in instance network info cache for port a0f1ad6d-706f-46d9-889a-e7855b92baeb. 
{{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1764.019872] env[67893]: DEBUG nova.network.neutron [req-f2fd1fa3-9226-4bab-a632-810a6d404050 req-305a399e-7e48-4523-ba20-c4c742fcb889 service nova] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Updating instance_info_cache with network_info: [{"id": "a0f1ad6d-706f-46d9-889a-e7855b92baeb", "address": "fa:16:3e:c0:09:fb", "network": {"id": "2e27f016-1dcd-4d7f-bcac-afbee70b1806", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-130882583-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ceacadba48b74fc3aeaf5968e3a9a0cd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4fb94adb-cc41-4c16-9830-a3205dbd2bf5", "external-id": "nsx-vlan-transportzone-100", "segmentation_id": 100, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa0f1ad6d-70", "ovs_interfaceid": "a0f1ad6d-706f-46d9-889a-e7855b92baeb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1764.030673] env[67893]: DEBUG oslo_concurrency.lockutils [req-f2fd1fa3-9226-4bab-a632-810a6d404050 req-305a399e-7e48-4523-ba20-c4c742fcb889 service nova] Releasing lock "refresh_cache-15893b5f-a02a-4ce7-80c9-eea0658f9ac7" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1764.299080] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455469, 'name': CreateVM_Task, 'duration_secs': 0.289645} completed successfully. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1764.299367] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1764.299938] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1764.300115] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1764.300438] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1764.300686] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b21959bf-9bcd-48ea-898b-71d394745308 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1764.305187] env[67893]: DEBUG oslo_vmware.api [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Waiting for the task: (returnval){
[ 1764.305187] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]526b426b-b44f-933e-e7c7-d7d744498118"
[ 1764.305187] env[67893]: _type = "Task"
[ 1764.305187] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1764.312604] env[67893]: DEBUG oslo_vmware.api [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]526b426b-b44f-933e-e7c7-d7d744498118, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
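The "Waiting for the task ... progress is 0%" pairs above come from oslo.vmware's task poller: a VI SDK call returns a Task managed-object reference, and wait_for_task re-reads its state on task_poll_interval until the task succeeds or raises. A minimal sketch of that invoke-then-wait pattern; the host, credentials, and datastore-browser moref are placeholders, not values from this run:

```python
from oslo_vmware import api

# Placeholder endpoint and credentials; poll interval mirrors the cadence
# of the "_poll_task" lines above.
session = api.VMwareAPISession('vc.example.test', 'user', 'secret',
                               api_retry_count=10, task_poll_interval=0.5)

browser = ...  # HostDatastoreBrowser moref (placeholder)
task = session.invoke_api(session.vim, 'SearchDatastore_Task', browser,
                          datastorePath='[datastore1] devstack-image-cache_base')
task_info = session.wait_for_task(task)  # blocks, logging progress, until done
```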
[ 1764.815744] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1764.816021] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1764.816232] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1775.605106] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "7169c720-f69e-40a3-95d2-473639884cd9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1775.605463] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "7169c720-f69e-40a3-95d2-473639884cd9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1775.857512] env[67893]: DEBUG oslo_concurrency.lockutils [None req-e4b69b06-dc8d-4a0c-8630-110aa29c0fd4 tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Acquiring lock "9fc9f6b0-928e-46b4-ad7c-9217b2f31575" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1784.859749] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1784.860122] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Cleaning up deleted instances with incomplete migration {{(pid=67893) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}}
[ 1786.870387] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1790.859759] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1790.860108] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 1790.860108] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 1790.881255] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1790.881449] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1790.881617] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1790.881764] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1790.881926] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1790.882096] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1790.882247] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1790.882392] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1790.882533] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1790.882774] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1790.882850] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 1793.858595] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1793.858904] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1794.859093] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1794.859460] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
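Each "Running periodic task ComputeManager._..." line above is oslo.service's periodic-task machinery firing a decorated method on the compute manager. Roughly, as a sketch (the class, spacing, and method body here are illustrative, not Nova's actual definitions):

```python
from oslo_config import cfg
from oslo_service import periodic_task

class Manager(periodic_task.PeriodicTasks):
    @periodic_task.periodic_task(spacing=60)  # spacing is illustrative
    def _heal_instance_info_cache(self, context):
        # Nova's version walks its instance list and refreshes the network
        # info cache, emitting the "Skipping ... Building" lines above.
        pass

mgr = Manager(cfg.CONF)
mgr.run_periodic_tasks(context=None)  # the service loop calls this repeatedly
```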
[ 1795.859522] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1795.859927] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1795.859927] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1795.860160] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1797.863054] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1797.883809] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1797.883993] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Cleaning up deleted instances {{(pid=67893) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}}
[ 1797.892387] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] There are 0 instances to clean {{(pid=67893) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}}
[ 1798.867566] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1798.879416] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1798.879663] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1798.879830] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
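The Acquiring/acquired/released triplets, with their waited/held timings, are oslo.concurrency's lockutils logging from its inner wrapper; Nova serializes critical sections either with the synchronized decorator or the lock context manager. A sketch (lock names follow the ones in the log; the bodies are illustrative):

```python
from oslo_concurrency import lockutils

@lockutils.synchronized('compute_resources')
def update_available_resource():
    # Serialized with every other caller holding "compute_resources";
    # the acquire/release is what produces the waited/held DEBUG lines.
    pass

# Context-manager form, as used for the per-image datastore cache locks:
with lockutils.lock('[datastore1] devstack-image-cache_base/<image-id>'):
    pass
```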
[ 1798.880039] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 1798.881191] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fb03cdf-4e9d-4e17-ab4e-5c240af779ed {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1798.890479] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47f452e7-52ea-457f-a43a-6bb44e042cd4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1798.904186] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-230df4af-ae15-4e1d-9b67-0cdca3a34fab {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1798.910243] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f42a888b-611b-466c-8c8f-3a526f0ee23c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1798.938879] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180926MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 1798.939043] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1798.939240] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1799.094045] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1799.094160] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 021f1a86-6015-4a22-b501-3ec9079edbec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1799.094255] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 25d67f98-c132-434b-9d22-4569585527eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1799.094381] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 41b5c5ec-936a-4abe-9db7-38d0d2aa371d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1799.094505] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance e1849daf-3781-42ef-bede-267efbb652c9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1799.094626] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2875b0a3-0213-4908-b86b-ce45a8901553 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1799.094743] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance dfb92d1c-c2a5-49c1-8526-3743cb385c97 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1799.094861] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance ad60df35-54c0-459e-8a25-981922ae0a88 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1799.094980] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 9fc9f6b0-928e-46b4-ad7c-9217b2f31575 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1799.095160] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 15893b5f-a02a-4ce7-80c9-eea0658f9ac7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1799.105991] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 94760898-4f3c-4f41-85be-366f4108d0ba has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1799.115868] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 11000d92-0094-4561-a807-ca76610ea549 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1799.125782] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 7169c720-f69e-40a3-95d2-473639884cd9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1799.126010] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 1799.126176] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 1799.143697] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Refreshing inventories for resource provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}}
[ 1799.158425] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Updating ProviderTree inventory for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}}
[ 1799.158642] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Updating inventory in ProviderTree for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}}
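The "Final resource view" line is consistent with the per-instance allocations listed above: ten instances each holding {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}, with used_ram apparently also counting the 512 MB reserved in the MEMORY_MB inventory. A quick check in Python:

```python
# Reconciling "used_ram=1792MB used_disk=10GB used_vcpus=10" with the
# ten allocations and the reserved=512 MEMORY_MB value shown above.
allocations = [{'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}] * 10
reserved_mb = 512

used_ram = reserved_mb + sum(a['MEMORY_MB'] for a in allocations)   # 1792
used_disk = sum(a['DISK_GB'] for a in allocations)                  # 10
used_vcpus = sum(a['VCPU'] for a in allocations)                    # 10
```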
[ 1799.172431] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Refreshing aggregate associations for resource provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57, aggregates: None {{(pid=67893) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}}
[ 1799.191097] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Refreshing trait associations for resource provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO {{(pid=67893) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}}
[ 1799.343133] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2d1373d-82f0-425c-ba5c-67b8391fe779 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1799.352349] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-824f00b9-2244-4f58-ba5f-159acf0ed4ab {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1799.381304] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f5564fe-37eb-47ae-9cc9-89bc7fc495d2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1799.388465] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2572e38f-82c5-42f7-abdb-316a5c311c5d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1799.401860] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1799.409788] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1799.426055] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 1799.426241] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.487s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1804.325623] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._sync_power_states {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1804.349942] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Getting list of instances from cluster (obj){
[ 1804.349942] env[67893]: value = "domain-c8"
[ 1804.349942] env[67893]: _type = "ClusterComputeResource"
[ 1804.349942] env[67893]: } {{(pid=67893) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}}
[ 1804.351289] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-351bd3fa-4bae-4772-bb15-a6dc0bed8ac4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1804.368599] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Got total of 10 instances {{(pid=67893) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}}
[ 1804.368772] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Triggering sync for uuid 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0 {{(pid=67893) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}}
[ 1804.368962] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Triggering sync for uuid 021f1a86-6015-4a22-b501-3ec9079edbec {{(pid=67893) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}}
[ 1804.369141] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Triggering sync for uuid 25d67f98-c132-434b-9d22-4569585527eb {{(pid=67893) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}}
[ 1804.369305] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Triggering sync for uuid 41b5c5ec-936a-4abe-9db7-38d0d2aa371d {{(pid=67893) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}}
[ 1804.369458] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Triggering sync for uuid e1849daf-3781-42ef-bede-267efbb652c9 {{(pid=67893) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}}
[ 1804.369609] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Triggering sync for uuid 2875b0a3-0213-4908-b86b-ce45a8901553 {{(pid=67893) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}}
[ 1804.369759] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Triggering sync for uuid dfb92d1c-c2a5-49c1-8526-3743cb385c97 {{(pid=67893) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}}
[ 1804.369911] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Triggering sync for uuid ad60df35-54c0-459e-8a25-981922ae0a88 {{(pid=67893) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}}
[ 1804.370092] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Triggering sync for uuid 9fc9f6b0-928e-46b4-ad7c-9217b2f31575 {{(pid=67893) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}}
[ 1804.370264] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Triggering sync for uuid 15893b5f-a02a-4ce7-80c9-eea0658f9ac7 {{(pid=67893) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}}
[ 1804.370582] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "1068cd1b-317e-42d5-b348-5bfdbb2b4dc0" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1804.370900] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "021f1a86-6015-4a22-b501-3ec9079edbec" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1804.371128] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "25d67f98-c132-434b-9d22-4569585527eb" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1804.371333] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "41b5c5ec-936a-4abe-9db7-38d0d2aa371d" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1804.371529] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "e1849daf-3781-42ef-bede-267efbb652c9" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1804.371725] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "2875b0a3-0213-4908-b86b-ce45a8901553" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1804.371917] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "dfb92d1c-c2a5-49c1-8526-3743cb385c97" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1804.372126] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "ad60df35-54c0-459e-8a25-981922ae0a88" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1804.372320] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "9fc9f6b0-928e-46b4-ad7c-9217b2f31575" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1804.372515] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "15893b5f-a02a-4ce7-80c9-eea0658f9ac7" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1810.971690] env[67893]: WARNING oslo_vmware.rw_handles [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1810.971690] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1810.971690] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1810.971690] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 1810.971690] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1810.971690] env[67893]: ERROR oslo_vmware.rw_handles response.begin()
[ 1810.971690] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1810.971690] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 1810.971690] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1810.971690] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 1810.971690] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1810.971690] env[67893]: ERROR oslo_vmware.rw_handles
[ 1810.972548] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/09a715de-2911-4cf3-af36-5e1fe196b827/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1810.974407] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1810.974676] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Copying Virtual Disk [datastore1] vmware_temp/09a715de-2911-4cf3-af36-5e1fe196b827/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/09a715de-2911-4cf3-af36-5e1fe196b827/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
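The rw_handles warning records vCenter closing the image-transfer connection before sending a status line: the handle's close() tries to drain the response, http.client raises RemoteDisconnected, and oslo.vmware logs and swallows it; the next line confirms the download itself completed. The same exception is easy to reproduce with plain http.client (the host below is a placeholder):

```python
import http.client

conn = http.client.HTTPSConnection('vc.example.test')  # placeholder host
try:
    conn.request('GET', '/folder/disk.vmdk')
    resp = conn.getresponse()  # raises if the peer closed the socket first
except http.client.RemoteDisconnected:
    # Same failure as the traceback above: response.begin() reads no status line.
    pass
```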
[ 1810.974965] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a220b920-4f8e-4d43-9cb6-9e47cd827c5c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1810.983280] env[67893]: DEBUG oslo_vmware.api [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Waiting for the task: (returnval){
[ 1810.983280] env[67893]: value = "task-3455470"
[ 1810.983280] env[67893]: _type = "Task"
[ 1810.983280] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1810.991216] env[67893]: DEBUG oslo_vmware.api [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Task: {'id': task-3455470, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1811.494514] env[67893]: DEBUG oslo_vmware.exceptions [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 1811.494653] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1811.495675] env[67893]: ERROR nova.compute.manager [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1811.495675] env[67893]: Faults: ['InvalidArgument']
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Traceback (most recent call last):
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] yield resources
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] self.driver.spawn(context, instance, image_meta,
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] self._fetch_image_if_missing(context, vi)
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] image_cache(vi, tmp_image_ds_loc)
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] vm_util.copy_virtual_disk(
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] session._wait_for_task(vmdk_copy_task)
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] return self.wait_for_task(task_ref)
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] return evt.wait()
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] result = hub.switch()
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] return self.greenlet.switch()
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] self.f(*self.args, **self.kw)
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] raise exceptions.translate_fault(task_info.error)
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Faults: ['InvalidArgument']
[ 1811.495675] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0]
[ 1811.495675] env[67893]: INFO nova.compute.manager [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Terminating instance
[ 1811.497016] env[67893]: DEBUG oslo_concurrency.lockutils [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1811.497233] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1811.497481] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ec75a4be-3d6e-4512-b5a8-45d146ca8d45 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1811.501038] env[67893]: DEBUG nova.compute.manager [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
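The spawn failure is a VI fault: CopyVirtualDisk_Task errors with InvalidArgument ("A specified parameter was not correct: fileType"), _poll_task converts the task error via exceptions.translate_fault, and the earlier "Fault InvalidArgument not matched" line means no specific exception class is registered for that fault name, so the generic VimFaultException propagates out of the spawn path. Callers can still branch on the fault names; a sketch, reusing the session from the earlier sketch and with `task` standing for the copy task above:

```python
from oslo_vmware import exceptions as vexc

try:
    session.wait_for_task(task)
except vexc.VimFaultException as e:
    # fault_list holds the VI fault names, e.g. ['InvalidArgument'];
    # str(e) carries the server message about the bad fileType parameter.
    if 'InvalidArgument' in (e.fault_list or []):
        raise
```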
[ 1811.501287] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1811.502139] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2e44e56-5815-4045-bd74-44876abe338a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1811.509484] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1811.509736] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9c2413d0-3ddd-4d7b-9acc-073490eef5f9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1811.512145] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1811.512378] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1811.513382] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-39a70e30-f676-452e-bb2d-3b722af6932b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1811.518150] env[67893]: DEBUG oslo_vmware.api [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Waiting for the task: (returnval){
[ 1811.518150] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]526e776e-fa31-eea8-1bcb-385da2325050"
[ 1811.518150] env[67893]: _type = "Task"
[ 1811.518150] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1811.525454] env[67893]: DEBUG oslo_vmware.api [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]526e776e-fa31-eea8-1bcb-385da2325050, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1811.591081] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1811.591263] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1811.591389] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Deleting the datastore file [datastore1] 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0 {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1811.591654] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-742f7dfb-8679-43f5-9447-e283b6e4710d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1811.597404] env[67893]: DEBUG oslo_vmware.api [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Waiting for the task: (returnval){
[ 1811.597404] env[67893]: value = "task-3455472"
[ 1811.597404] env[67893]: _type = "Task"
[ 1811.597404] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1811.604744] env[67893]: DEBUG oslo_vmware.api [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Task: {'id': task-3455472, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1812.028501] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1812.028884] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Creating directory with path [datastore1] vmware_temp/af3c53a0-4ea5-479d-b7c7-39edbb11414e/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1812.029033] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-017e81b7-06b8-4f48-81e2-369d4f16c0d9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1812.039762] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Created directory with path [datastore1] vmware_temp/af3c53a0-4ea5-479d-b7c7-39edbb11414e/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1812.040139] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Fetch image to [datastore1] vmware_temp/af3c53a0-4ea5-479d-b7c7-39edbb11414e/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1812.040286] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/af3c53a0-4ea5-479d-b7c7-39edbb11414e/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1812.040855] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f751d71-7796-4365-afa7-388ed9fa5153 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1812.046965] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5c83228-a876-46ef-9b8d-11572dcd8d07 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1812.055698] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71d8ba65-b64e-4692-aa38-bdf452837236 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1812.085950] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-e7a66e85-1987-416f-9cbf-66dec75cebf3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1812.091214] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c0efae84-c35c-48bb-a01c-50f3210dd254 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1812.104989] env[67893]: DEBUG oslo_vmware.api [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Task: {'id': task-3455472, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065515} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1812.105271] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1812.105474] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1812.105688] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1812.105873] env[67893]: INFO nova.compute.manager [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Took 0.60 seconds to destroy the instance on the hypervisor. 
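The DeleteDatastoreFile_Task entries above show oslo.vmware's invoke-then-poll pattern: the vSphere call returns a Task managed object, wait_for_task() re-reads its info on a fixed interval (the repeated "progress is 0%." lines), and a populated task_info.error is raised as a translated VimException, exactly as happens for the CopyVirtualDisk_Task later in this log. A minimal sketch of that pattern follows; the host, credentials, datacenter ref and file path are placeholders, not values from this log.

    # Sketch only: the invoke/poll pattern behind the task lines above.
    from oslo_vmware import api

    session = api.VMwareAPISession('vc.example.org', 'user', 'secret',
                                   api_retry_count=10, task_poll_interval=0.5)
    file_manager = session.vim.service_content.fileManager
    dc_ref = None  # Datacenter moref; real code looks this up first
    task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                              file_manager,
                              name='[datastore1] example-instance-uuid',
                              datacenter=dc_ref)
    # Polls task.info until it reports success; raises the translated fault
    # (e.g. VimFaultException) if task.info.error is set.
    task_info = session.wait_for_task(task)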
[ 1812.107994] env[67893]: DEBUG nova.compute.claims [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1812.108200] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1812.108464] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1812.112596] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1812.253087] env[67893]: DEBUG oslo_vmware.rw_handles [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/af3c53a0-4ea5-479d-b7c7-39edbb11414e/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1812.312860] env[67893]: DEBUG oslo_vmware.rw_handles [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1812.313056] env[67893]: DEBUG oslo_vmware.rw_handles [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/af3c53a0-4ea5-479d-b7c7-39edbb11414e/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1812.364196] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fddaa703-edde-4721-b48b-7e8ba7f48990 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1812.371809] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b533c46f-e259-4ea5-98df-8fd5b560db48 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1812.401596] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e27e7a73-ab30-4333-8307-a77e43179ab1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1812.410023] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-203e26b4-9314-4130-b905-a5fb4383eb31 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1812.421769] env[67893]: DEBUG nova.compute.provider_tree [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1812.431029] env[67893]: DEBUG nova.scheduler.client.report [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1812.446785] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.338s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1812.446833] env[67893]: ERROR nova.compute.manager [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1812.446833] env[67893]: Faults: ['InvalidArgument'] [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Traceback (most recent call last): [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1812.446833] 
env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] self.driver.spawn(context, instance, image_meta, [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] self._fetch_image_if_missing(context, vi) [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] image_cache(vi, tmp_image_ds_loc) [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] vm_util.copy_virtual_disk( [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] session._wait_for_task(vmdk_copy_task) [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] return self.wait_for_task(task_ref) [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] return evt.wait() [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] result = hub.switch() [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] return self.greenlet.switch() [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] self.f(*self.args, **self.kw) [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] raise exceptions.translate_fault(task_info.error) [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Faults: ['InvalidArgument'] [ 1812.446833] env[67893]: ERROR nova.compute.manager [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] [ 1812.447887] env[67893]: DEBUG nova.compute.utils [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1812.448974] env[67893]: DEBUG nova.compute.manager [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Build of instance 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0 was re-scheduled: A specified parameter was not correct: fileType [ 1812.448974] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1812.449359] env[67893]: DEBUG nova.compute.manager [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1812.449594] env[67893]: DEBUG nova.compute.manager [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1812.449752] env[67893]: DEBUG nova.compute.manager [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1812.449919] env[67893]: DEBUG nova.network.neutron [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1812.894594] env[67893]: DEBUG nova.network.neutron [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1812.909301] env[67893]: INFO nova.compute.manager [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Took 0.46 seconds to deallocate network for instance. [ 1813.007118] env[67893]: INFO nova.scheduler.client.report [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Deleted allocations for instance 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0 [ 1813.028692] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4c2a6d4f-3a2b-47fc-b093-3af1181ff9d8 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Lock "1068cd1b-317e-42d5-b348-5bfdbb2b4dc0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 630.921s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1813.029917] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b52b7eb6-0937-4f37-9f1e-6f0a8f8bd897 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Lock "1068cd1b-317e-42d5-b348-5bfdbb2b4dc0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 435.239s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1813.030233] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b52b7eb6-0937-4f37-9f1e-6f0a8f8bd897 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Acquiring lock "1068cd1b-317e-42d5-b348-5bfdbb2b4dc0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1813.030462] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b52b7eb6-0937-4f37-9f1e-6f0a8f8bd897 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Lock "1068cd1b-317e-42d5-b348-5bfdbb2b4dc0-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1813.030626] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b52b7eb6-0937-4f37-9f1e-6f0a8f8bd897 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Lock "1068cd1b-317e-42d5-b348-5bfdbb2b4dc0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1813.032760] env[67893]: INFO nova.compute.manager [None req-b52b7eb6-0937-4f37-9f1e-6f0a8f8bd897 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Terminating instance [ 1813.034471] env[67893]: DEBUG nova.compute.manager [None req-b52b7eb6-0937-4f37-9f1e-6f0a8f8bd897 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1813.034956] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-b52b7eb6-0937-4f37-9f1e-6f0a8f8bd897 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1813.035148] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f8c8924b-610d-4f90-a55d-34060dee817c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1813.044245] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d303c969-b75c-4a1e-9f20-fc391ffe9a92 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1813.056864] env[67893]: DEBUG nova.compute.manager [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1813.078499] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-b52b7eb6-0937-4f37-9f1e-6f0a8f8bd897 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0 could not be found.
[ 1813.078742] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-b52b7eb6-0937-4f37-9f1e-6f0a8f8bd897 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1813.078958] env[67893]: INFO nova.compute.manager [None req-b52b7eb6-0937-4f37-9f1e-6f0a8f8bd897 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1813.079669] env[67893]: DEBUG oslo.service.loopingcall [None req-b52b7eb6-0937-4f37-9f1e-6f0a8f8bd897 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1813.079669] env[67893]: DEBUG nova.compute.manager [-] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1813.079669] env[67893]: DEBUG nova.network.neutron [-] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1813.106442] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1813.106701] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1813.108136] env[67893]: INFO nova.compute.claims [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1813.110892] env[67893]: DEBUG nova.network.neutron [-] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1813.118357] env[67893]: INFO nova.compute.manager [-] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] Took 0.04 seconds to deallocate network for instance.
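The "compute_resources" acquire/release pairs above (and the abort_instance_claim pair earlier) come from oslo.concurrency: a named semaphore wraps the resource tracker's claim paths, and the decorator's inner() wrapper logs how long each caller waited for and then held the lock. The .<locals>. qualnames in the lock messages are the nested functions the decorator wraps. A minimal sketch of the same pattern, not nova's exact code:

    # Sketch of the pattern behind the "Acquiring lock" / "acquired ...
    # waited" / "released ... held" triples in this log.
    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def instance_claim(instance):
        # Runs with the 'compute_resources' semaphore held; the wrapper
        # emits the waited/held DEBUG lines seen above.
        ...

Any other function decorated with the same lock name serializes against this one, which is how the claim and abort paths exclude each other on a node.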
[ 1813.210982] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b52b7eb6-0937-4f37-9f1e-6f0a8f8bd897 tempest-ServersTestFqdnHostnames-926898831 tempest-ServersTestFqdnHostnames-926898831-project-member] Lock "1068cd1b-317e-42d5-b348-5bfdbb2b4dc0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.181s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1813.211849] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "1068cd1b-317e-42d5-b348-5bfdbb2b4dc0" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 8.841s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1813.212049] env[67893]: INFO nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 1068cd1b-317e-42d5-b348-5bfdbb2b4dc0] During sync_power_state the instance has a pending task (deleting). Skip. [ 1813.212228] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "1068cd1b-317e-42d5-b348-5bfdbb2b4dc0" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1813.302499] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc554b95-190e-4e08-a64b-d97e774a70d8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1813.311818] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40d74b3e-6071-4d8f-92ea-e55db8ad4cf3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1813.343630] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48369dc6-4754-4bea-a0ca-00bfb4952dc2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1813.351128] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-699feb1f-4644-4846-9360-310b98be506a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1813.366009] env[67893]: DEBUG nova.compute.provider_tree [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1813.375339] env[67893]: DEBUG nova.scheduler.client.report [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1,
'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1813.390220] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.283s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1813.390783] env[67893]: DEBUG nova.compute.manager [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1813.425736] env[67893]: DEBUG nova.compute.utils [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1813.427233] env[67893]: DEBUG nova.compute.manager [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Not allocating networking since 'none' was specified. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1813.437342] env[67893]: DEBUG nova.compute.manager [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1813.501314] env[67893]: DEBUG nova.compute.manager [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1813.528386] env[67893]: DEBUG nova.virt.hardware [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1813.528646] env[67893]: DEBUG nova.virt.hardware [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1813.528802] env[67893]: DEBUG nova.virt.hardware [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1813.528982] env[67893]: DEBUG nova.virt.hardware [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1813.529165] env[67893]: DEBUG nova.virt.hardware [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1813.529319] env[67893]: DEBUG nova.virt.hardware [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1813.529534] env[67893]: DEBUG nova.virt.hardware [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1813.529698] env[67893]: DEBUG nova.virt.hardware [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1813.529865] env[67893]: DEBUG nova.virt.hardware [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 
tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1813.530211] env[67893]: DEBUG nova.virt.hardware [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1813.530453] env[67893]: DEBUG nova.virt.hardware [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1813.531927] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ef5ebe6-fe63-4bf0-b4f0-a228be31fd02 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1813.541873] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6bfed83-432e-4603-9283-74d6afe736f5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1813.555984] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Instance VIF info [] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1813.561446] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Creating folder: Project (14e9d57234e44ce1bf94363d022b9b2e). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1813.561703] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d722d814-8d22-4a38-90ec-fd2382a3db4c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1813.573052] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Created folder: Project (14e9d57234e44ce1bf94363d022b9b2e) in parent group-v689771. [ 1813.573238] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Creating folder: Instances. Parent ref: group-v689870. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1813.573473] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-48a63448-e308-4c82-8d9d-38c379bd075e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1813.581345] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Created folder: Instances in parent group-v689870. 
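The two Folder.CreateFolder invocations above build the usual two-level layout: a per-project folder under the compute folder (group-v689771 in this log), then an Instances folder beneath it (group-v689870). A hedged sketch of those calls against an existing oslo.vmware session; the parent ref and folder names are examples, not nova's exact vm_util code:

    # Sketch: the Project/Instances folder pair created above.
    def create_instance_folders(session, parent_ref):
        # parent_ref: the folder to create under (group-v689771 here).
        project = session.invoke_api(session.vim, 'CreateFolder', parent_ref,
                                     name='Project (example-project-id)')
        # vSphere raises a DuplicateName fault if the folder already exists;
        # nova catches that and reuses the existing folder instead of failing.
        return session.invoke_api(session.vim, 'CreateFolder', project,
                                  name='Instances')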
[ 1813.581567] env[67893]: DEBUG oslo.service.loopingcall [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1813.581798] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1813.581925] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8d107fb4-97c5-4f04-b77c-0dcde671ad14 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1813.597320] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1813.597320] env[67893]: value = "task-3455475" [ 1813.597320] env[67893]: _type = "Task" [ 1813.597320] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1813.603948] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455475, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1814.107359] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455475, 'name': CreateVM_Task, 'duration_secs': 0.235855} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1814.107731] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1814.108043] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1814.108324] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1814.108690] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1814.108997] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-def50443-33fa-4ae9-b583-61ec347b8b3d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1814.113347] env[67893]: DEBUG oslo_vmware.api [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Waiting for the task: 
(returnval){ [ 1814.113347] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]527d08df-f268-68f7-24b4-a96a61fd9ac2" [ 1814.113347] env[67893]: _type = "Task" [ 1814.113347] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1814.121947] env[67893]: DEBUG oslo_vmware.api [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]527d08df-f268-68f7-24b4-a96a61fd9ac2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1814.624363] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1814.624630] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1814.624841] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1831.365775] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "72410dc2-74d9-4d59-bdd1-ad45b01c482b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1831.366125] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "72410dc2-74d9-4d59-bdd1-ad45b01c482b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1843.722129] env[67893]: DEBUG oslo_concurrency.lockutils [None req-cdbedd38-46d7-4dba-8be4-354990c429f5 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Acquiring lock "15893b5f-a02a-4ce7-80c9-eea0658f9ac7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1848.907127] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances
{{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1852.860133] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1852.860630] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1852.860630] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1852.883054] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1852.883216] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1852.883349] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1852.883473] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1852.883600] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1852.883719] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1852.883838] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1852.883957] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Skipping network cache update for instance because it is Building. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1852.884089] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1852.884208] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1852.884325] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1853.859455] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1854.854618] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1854.858454] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1854.858625] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1855.859988] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1855.860242] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1856.859609] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1859.993765] env[67893]: WARNING oslo_vmware.rw_handles [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1859.993765] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1859.993765] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1859.993765] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1859.993765] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1859.993765] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 1859.993765] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1859.993765] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1859.993765] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1859.993765] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1859.993765] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1859.993765] env[67893]: ERROR oslo_vmware.rw_handles [ 1859.994500] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/af3c53a0-4ea5-479d-b7c7-39edbb11414e/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1859.996082] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1859.996325] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c 
tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Copying Virtual Disk [datastore1] vmware_temp/af3c53a0-4ea5-479d-b7c7-39edbb11414e/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/af3c53a0-4ea5-479d-b7c7-39edbb11414e/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1859.996602] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-532cd8b0-2122-499b-b97d-c9766d01e11c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1860.004675] env[67893]: DEBUG oslo_vmware.api [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Waiting for the task: (returnval){ [ 1860.004675] env[67893]: value = "task-3455476" [ 1860.004675] env[67893]: _type = "Task" [ 1860.004675] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1860.012416] env[67893]: DEBUG oslo_vmware.api [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Task: {'id': task-3455476, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1860.514394] env[67893]: DEBUG oslo_vmware.exceptions [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Fault InvalidArgument not matched. 
{{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1860.514668] env[67893]: DEBUG oslo_concurrency.lockutils [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1860.515258] env[67893]: ERROR nova.compute.manager [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1860.515258] env[67893]: Faults: ['InvalidArgument'] [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Traceback (most recent call last): [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] yield resources [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] self.driver.spawn(context, instance, image_meta, [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] self._fetch_image_if_missing(context, vi) [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] image_cache(vi, tmp_image_ds_loc) [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] vm_util.copy_virtual_disk( [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] session._wait_for_task(vmdk_copy_task) [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] return self.wait_for_task(task_ref) [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] return evt.wait() [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] result = hub.switch() [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] return self.greenlet.switch() [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] self.f(*self.args, **self.kw) [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] raise exceptions.translate_fault(task_info.error) [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Faults: ['InvalidArgument'] [ 1860.515258] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] [ 1860.516248] env[67893]: INFO nova.compute.manager [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Terminating instance [ 1860.517143] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1860.517368] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1860.517600] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-7719ee16-757a-4a28-9d74-b49336183e7e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1860.519927] env[67893]: DEBUG nova.compute.manager [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1860.520135] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1860.520877] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f761657-998c-477d-9aab-6a923883e38d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1860.527280] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1860.527481] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-dcbc3d41-1c5e-4a63-8745-09e61c270a5e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1860.529545] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1860.529721] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1860.530675] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0fc0f438-1a75-473c-9710-32c571d2f374 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1860.535181] env[67893]: DEBUG oslo_vmware.api [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Waiting for the task: (returnval){ [ 1860.535181] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]5239f4e1-4b1b-3561-85a9-b638b8d0edf4" [ 1860.535181] env[67893]: _type = "Task" [ 1860.535181] env[67893]: } to complete. 
{{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1860.541973] env[67893]: DEBUG oslo_vmware.api [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]5239f4e1-4b1b-3561-85a9-b638b8d0edf4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1860.594718] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1860.594929] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1860.595168] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Deleting the datastore file [datastore1] 021f1a86-6015-4a22-b501-3ec9079edbec {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1860.595436] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5cab12e7-feb1-4880-96ad-b08f4ff81372 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1860.602616] env[67893]: DEBUG oslo_vmware.api [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Waiting for the task: (returnval){ [ 1860.602616] env[67893]: value = "task-3455478" [ 1860.602616] env[67893]: _type = "Task" [ 1860.602616] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1860.611422] env[67893]: DEBUG oslo_vmware.api [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Task: {'id': task-3455478, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1860.859366] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1860.871326] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1860.871553] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1860.871747] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1860.871909] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1860.873043] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39098291-80a4-4542-af2d-4c8d8011093a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1860.881273] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ee034bf-896e-4c52-b7cd-11a232aae94e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1860.894940] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ce5adca-76e4-4024-b65f-2a63b66453d0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1860.900976] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50c1057f-461d-4515-beee-4fa36b523e0f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1860.929241] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180962MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1860.929378] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1860.929522] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1861.006707] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 021f1a86-6015-4a22-b501-3ec9079edbec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.006942] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 25d67f98-c132-434b-9d22-4569585527eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.007011] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 41b5c5ec-936a-4abe-9db7-38d0d2aa371d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.007147] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance e1849daf-3781-42ef-bede-267efbb652c9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.007265] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2875b0a3-0213-4908-b86b-ce45a8901553 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.007382] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance dfb92d1c-c2a5-49c1-8526-3743cb385c97 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.007496] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance ad60df35-54c0-459e-8a25-981922ae0a88 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.007612] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 9fc9f6b0-928e-46b4-ad7c-9217b2f31575 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.007723] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 15893b5f-a02a-4ce7-80c9-eea0658f9ac7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.007907] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 94760898-4f3c-4f41-85be-366f4108d0ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.020399] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 11000d92-0094-4561-a807-ca76610ea549 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1861.047882] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1861.047882] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Creating directory with path [datastore1] vmware_temp/c93074ab-f5e9-4e47-93ff-eacfd1fe3fb4/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1861.048211] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-13deae4c-4976-4b09-8b3d-614d59d2e2ac {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.052626] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 7169c720-f69e-40a3-95d2-473639884cd9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1861.059369] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Created directory with path [datastore1] vmware_temp/c93074ab-f5e9-4e47-93ff-eacfd1fe3fb4/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1861.059563] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Fetch image to [datastore1] vmware_temp/c93074ab-f5e9-4e47-93ff-eacfd1fe3fb4/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1861.059733] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/c93074ab-f5e9-4e47-93ff-eacfd1fe3fb4/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1861.060508] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5b547aa-cae4-48f7-b355-05e4b97fdaf6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.063859] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 72410dc2-74d9-4d59-bdd1-ad45b01c482b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1861.064099] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1861.064251] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1861.072524] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddbc5f7d-3d18-4664-b3aa-bf86df4a9cbe {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.084158] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-906a118d-a214-4d43-8b99-978f3025c912 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.119809] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e4e4870-2ac5-441f-b700-bf94df888d37 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.127029] env[67893]: DEBUG oslo_vmware.api [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Task: {'id': task-3455478, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075399} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1861.128465] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1861.128663] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1861.128840] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1861.129015] env[67893]: INFO nova.compute.manager [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 1861.130793] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c7225098-a3c8-465c-b4fe-aab9b00e5cd7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.134618] env[67893]: DEBUG nova.compute.claims [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1861.134792] env[67893]: DEBUG oslo_concurrency.lockutils [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1861.158940] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1861.210192] env[67893]: DEBUG oslo_vmware.rw_handles [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c93074ab-f5e9-4e47-93ff-eacfd1fe3fb4/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1861.273052] env[67893]: DEBUG oslo_vmware.rw_handles [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1861.273168] env[67893]: DEBUG oslo_vmware.rw_handles [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c93074ab-f5e9-4e47-93ff-eacfd1fe3fb4/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1861.312556] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d17f5cd-a86b-4a5b-a706-796956dea770 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.320805] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a2d16e0-4500-442b-ab8f-e7e3ba487a32 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.354281] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6e4595b-d61c-4dd7-8b9f-4dd2932504f7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.361545] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc693ec6-2220-4af3-a299-3330704baa71 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.374209] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1861.382587] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1861.396051] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1861.396227] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.467s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1861.396478] env[67893]: DEBUG oslo_concurrency.lockutils [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.262s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1861.578804] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88d3fa0a-390b-4f1b-b40e-364d33f87dcf {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.588627] 
env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12ed245f-a5a8-4551-bf0a-87a8b16b1814 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.625956] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3713039-5c25-4fcf-b19b-9167adb2b906 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.633088] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-158be93a-66a0-4663-a8df-9e7d94195adf {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.645782] env[67893]: DEBUG nova.compute.provider_tree [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1861.654608] env[67893]: DEBUG nova.scheduler.client.report [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1861.667384] env[67893]: DEBUG oslo_concurrency.lockutils [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.271s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1861.667889] env[67893]: ERROR nova.compute.manager [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1861.667889] env[67893]: Faults: ['InvalidArgument'] [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Traceback (most recent call last): [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] self.driver.spawn(context, instance, image_meta, [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1861.667889] env[67893]: ERROR nova.compute.manager 
[instance: 021f1a86-6015-4a22-b501-3ec9079edbec] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] self._fetch_image_if_missing(context, vi) [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] image_cache(vi, tmp_image_ds_loc) [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] vm_util.copy_virtual_disk( [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] session._wait_for_task(vmdk_copy_task) [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] return self.wait_for_task(task_ref) [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] return evt.wait() [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] result = hub.switch() [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] return self.greenlet.switch() [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] self.f(*self.args, **self.kw) [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] raise exceptions.translate_fault(task_info.error) [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] 
oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Faults: ['InvalidArgument'] [ 1861.667889] env[67893]: ERROR nova.compute.manager [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] [ 1861.668895] env[67893]: DEBUG nova.compute.utils [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1861.669906] env[67893]: DEBUG nova.compute.manager [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Build of instance 021f1a86-6015-4a22-b501-3ec9079edbec was re-scheduled: A specified parameter was not correct: fileType [ 1861.669906] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1861.670292] env[67893]: DEBUG nova.compute.manager [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1861.670530] env[67893]: DEBUG nova.compute.manager [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1861.670718] env[67893]: DEBUG nova.compute.manager [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1861.670918] env[67893]: DEBUG nova.network.neutron [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1861.974164] env[67893]: DEBUG nova.network.neutron [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1861.988077] env[67893]: INFO nova.compute.manager [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Took 0.32 seconds to deallocate network for instance. 
[ 1862.087243] env[67893]: INFO nova.scheduler.client.report [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Deleted allocations for instance 021f1a86-6015-4a22-b501-3ec9079edbec [ 1862.108495] env[67893]: DEBUG oslo_concurrency.lockutils [None req-456c50c3-33c8-4188-a6f1-aec03c6c876c tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Lock "021f1a86-6015-4a22-b501-3ec9079edbec" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 625.220s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1862.110131] env[67893]: DEBUG oslo_concurrency.lockutils [None req-8e476051-6816-46b8-a83e-a9a07d91f0c5 tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Lock "021f1a86-6015-4a22-b501-3ec9079edbec" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 428.532s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1862.110867] env[67893]: DEBUG oslo_concurrency.lockutils [None req-8e476051-6816-46b8-a83e-a9a07d91f0c5 tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Acquiring lock "021f1a86-6015-4a22-b501-3ec9079edbec-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1862.110867] env[67893]: DEBUG oslo_concurrency.lockutils [None req-8e476051-6816-46b8-a83e-a9a07d91f0c5 tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Lock "021f1a86-6015-4a22-b501-3ec9079edbec-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1862.110867] env[67893]: DEBUG oslo_concurrency.lockutils [None req-8e476051-6816-46b8-a83e-a9a07d91f0c5 tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Lock "021f1a86-6015-4a22-b501-3ec9079edbec-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1862.112632] env[67893]: INFO nova.compute.manager [None req-8e476051-6816-46b8-a83e-a9a07d91f0c5 tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Terminating instance [ 1862.115192] env[67893]: DEBUG nova.compute.manager [None req-8e476051-6816-46b8-a83e-a9a07d91f0c5 tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Start destroying the instance on the hypervisor. 
{{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1862.115192] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-8e476051-6816-46b8-a83e-a9a07d91f0c5 tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1862.115192] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9649e352-0b53-40c6-9514-9927d74d28f3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1862.120298] env[67893]: DEBUG nova.compute.manager [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1862.127321] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef6b56e1-462f-4240-b1bf-ad61752c0a78 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1862.156464] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-8e476051-6816-46b8-a83e-a9a07d91f0c5 tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 021f1a86-6015-4a22-b501-3ec9079edbec could not be found. [ 1862.156678] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-8e476051-6816-46b8-a83e-a9a07d91f0c5 tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1862.156852] env[67893]: INFO nova.compute.manager [None req-8e476051-6816-46b8-a83e-a9a07d91f0c5 tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1862.157115] env[67893]: DEBUG oslo.service.loopingcall [None req-8e476051-6816-46b8-a83e-a9a07d91f0c5 tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1862.159362] env[67893]: DEBUG nova.compute.manager [-] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1862.159746] env[67893]: DEBUG nova.network.neutron [-] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1862.173843] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1862.174098] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1862.175518] env[67893]: INFO nova.compute.claims [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1862.185129] env[67893]: DEBUG nova.network.neutron [-] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1862.206153] env[67893]: INFO nova.compute.manager [-] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] Took 0.05 seconds to deallocate network for instance. [ 1862.298250] env[67893]: DEBUG oslo_concurrency.lockutils [None req-8e476051-6816-46b8-a83e-a9a07d91f0c5 tempest-AttachVolumeShelveTestJSON-1186050232 tempest-AttachVolumeShelveTestJSON-1186050232-project-member] Lock "021f1a86-6015-4a22-b501-3ec9079edbec" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.188s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1862.299682] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "021f1a86-6015-4a22-b501-3ec9079edbec" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 57.929s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1862.299874] env[67893]: INFO nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 021f1a86-6015-4a22-b501-3ec9079edbec] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1862.300057] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "021f1a86-6015-4a22-b501-3ec9079edbec" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1862.365584] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23fb58b6-358f-4d04-9b64-bb105ced15fd {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1862.373070] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-933b5d09-718a-4343-961e-9716d092fe0c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1862.403388] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56ec8b11-bc94-4ec0-9818-75052a2a1b29 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1862.410571] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2bf399f-6b69-437c-8389-0a2ee46804a5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1862.423486] env[67893]: DEBUG nova.compute.provider_tree [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1862.432155] env[67893]: DEBUG nova.scheduler.client.report [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1862.445428] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.271s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1862.445877] env[67893]: DEBUG nova.compute.manager [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Start building networks asynchronously for instance. 
{{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1862.475223] env[67893]: DEBUG nova.compute.utils [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1862.476622] env[67893]: DEBUG nova.compute.manager [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1862.476790] env[67893]: DEBUG nova.network.neutron [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1862.486016] env[67893]: DEBUG nova.compute.manager [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1862.545570] env[67893]: DEBUG nova.compute.manager [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Start spawning the instance on the hypervisor. {{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1862.552763] env[67893]: DEBUG nova.policy [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd1016091f2ab4fe69bcf52e8f536bc32', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '88e59a371b0d4dedb303e9b7f6d69b9d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 1862.569878] env[67893]: DEBUG nova.virt.hardware [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=<?>,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-15T20:39:38Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67893) 
_get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1862.570138] env[67893]: DEBUG nova.virt.hardware [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1862.570297] env[67893]: DEBUG nova.virt.hardware [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1862.570502] env[67893]: DEBUG nova.virt.hardware [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1862.570658] env[67893]: DEBUG nova.virt.hardware [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1862.570823] env[67893]: DEBUG nova.virt.hardware [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1862.571049] env[67893]: DEBUG nova.virt.hardware [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1862.571250] env[67893]: DEBUG nova.virt.hardware [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1862.571450] env[67893]: DEBUG nova.virt.hardware [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1862.571648] env[67893]: DEBUG nova.virt.hardware [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1862.571829] env[67893]: DEBUG nova.virt.hardware [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1862.572711] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45edc478-d6cc-4b0c-add0-795051fd1038 {{(pid=67893) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1862.581193] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d679bc0-11bc-4ee1-b7dc-55dd87cc29ed {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1862.895760] env[67893]: DEBUG nova.network.neutron [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Successfully created port: e7630305-aebb-429d-a077-ef3feca6ddf0 {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1863.539625] env[67893]: DEBUG nova.network.neutron [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Successfully updated port: e7630305-aebb-429d-a077-ef3feca6ddf0 {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1863.559011] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquiring lock "refresh_cache-11000d92-0094-4561-a807-ca76610ea549" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1863.559011] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquired lock "refresh_cache-11000d92-0094-4561-a807-ca76610ea549" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1863.559011] env[67893]: DEBUG nova.network.neutron [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1863.597430] env[67893]: DEBUG nova.network.neutron [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Instance cache missing network info. 
{{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1863.790533] env[67893]: DEBUG nova.network.neutron [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Updating instance_info_cache with network_info: [{"id": "e7630305-aebb-429d-a077-ef3feca6ddf0", "address": "fa:16:3e:48:2f:87", "network": {"id": "f5f37611-ef93-4a5d-8b1c-169af83eb7a6", "bridge": "br-int", "label": "tempest-ServersTestJSON-1801901943-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "88e59a371b0d4dedb303e9b7f6d69b9d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c3e2368-4a35-4aa5-9135-23daedbbf9ef", "external-id": "nsx-vlan-transportzone-125", "segmentation_id": 125, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape7630305-ae", "ovs_interfaceid": "e7630305-aebb-429d-a077-ef3feca6ddf0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1863.804304] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Releasing lock "refresh_cache-11000d92-0094-4561-a807-ca76610ea549" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1863.804579] env[67893]: DEBUG nova.compute.manager [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Instance network_info: |[{"id": "e7630305-aebb-429d-a077-ef3feca6ddf0", "address": "fa:16:3e:48:2f:87", "network": {"id": "f5f37611-ef93-4a5d-8b1c-169af83eb7a6", "bridge": "br-int", "label": "tempest-ServersTestJSON-1801901943-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "88e59a371b0d4dedb303e9b7f6d69b9d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c3e2368-4a35-4aa5-9135-23daedbbf9ef", "external-id": "nsx-vlan-transportzone-125", "segmentation_id": 125, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape7630305-ae", "ovs_interfaceid": "e7630305-aebb-429d-a077-ef3feca6ddf0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1863.805192] 
env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:48:2f:87', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8c3e2368-4a35-4aa5-9135-23daedbbf9ef', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e7630305-aebb-429d-a077-ef3feca6ddf0', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1863.812418] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Creating folder: Project (88e59a371b0d4dedb303e9b7f6d69b9d). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1863.812928] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c967e49d-752c-46fc-bf3d-8fcd699324ad {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1863.824125] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Created folder: Project (88e59a371b0d4dedb303e9b7f6d69b9d) in parent group-v689771. [ 1863.824312] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Creating folder: Instances. Parent ref: group-v689873. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1863.824522] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-368c96f9-a973-4238-9d21-0050a8766b8b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1863.833380] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Created folder: Instances in parent group-v689873. [ 1863.833599] env[67893]: DEBUG oslo.service.loopingcall [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1863.833774] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 11000d92-0094-4561-a807-ca76610ea549] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1863.833957] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7cfa7d7f-0761-4679-92a6-88b465a39ca9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1863.863073] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1863.863073] env[67893]: value = "task-3455481" [ 1863.863073] env[67893]: _type = "Task" [ 1863.863073] env[67893]: } to complete. 
{{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1863.870675] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455481, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1864.016782] env[67893]: DEBUG nova.compute.manager [req-6cd9df8e-468d-46e0-937f-4f41b04de91c req-ce1e53c7-1e1c-4f26-a39d-0c97db42aa7f service nova] [instance: 11000d92-0094-4561-a807-ca76610ea549] Received event network-vif-plugged-e7630305-aebb-429d-a077-ef3feca6ddf0 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1864.017137] env[67893]: DEBUG oslo_concurrency.lockutils [req-6cd9df8e-468d-46e0-937f-4f41b04de91c req-ce1e53c7-1e1c-4f26-a39d-0c97db42aa7f service nova] Acquiring lock "11000d92-0094-4561-a807-ca76610ea549-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1864.017396] env[67893]: DEBUG oslo_concurrency.lockutils [req-6cd9df8e-468d-46e0-937f-4f41b04de91c req-ce1e53c7-1e1c-4f26-a39d-0c97db42aa7f service nova] Lock "11000d92-0094-4561-a807-ca76610ea549-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1864.017622] env[67893]: DEBUG oslo_concurrency.lockutils [req-6cd9df8e-468d-46e0-937f-4f41b04de91c req-ce1e53c7-1e1c-4f26-a39d-0c97db42aa7f service nova] Lock "11000d92-0094-4561-a807-ca76610ea549-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1864.017856] env[67893]: DEBUG nova.compute.manager [req-6cd9df8e-468d-46e0-937f-4f41b04de91c req-ce1e53c7-1e1c-4f26-a39d-0c97db42aa7f service nova] [instance: 11000d92-0094-4561-a807-ca76610ea549] No waiting events found dispatching network-vif-plugged-e7630305-aebb-429d-a077-ef3feca6ddf0 {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1864.018077] env[67893]: WARNING nova.compute.manager [req-6cd9df8e-468d-46e0-937f-4f41b04de91c req-ce1e53c7-1e1c-4f26-a39d-0c97db42aa7f service nova] [instance: 11000d92-0094-4561-a807-ca76610ea549] Received unexpected event network-vif-plugged-e7630305-aebb-429d-a077-ef3feca6ddf0 for instance with vm_state building and task_state spawning. [ 1864.018277] env[67893]: DEBUG nova.compute.manager [req-6cd9df8e-468d-46e0-937f-4f41b04de91c req-ce1e53c7-1e1c-4f26-a39d-0c97db42aa7f service nova] [instance: 11000d92-0094-4561-a807-ca76610ea549] Received event network-changed-e7630305-aebb-429d-a077-ef3feca6ddf0 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1864.018632] env[67893]: DEBUG nova.compute.manager [req-6cd9df8e-468d-46e0-937f-4f41b04de91c req-ce1e53c7-1e1c-4f26-a39d-0c97db42aa7f service nova] [instance: 11000d92-0094-4561-a807-ca76610ea549] Refreshing instance network info cache due to event network-changed-e7630305-aebb-429d-a077-ef3feca6ddf0. 
{{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1864.018877] env[67893]: DEBUG oslo_concurrency.lockutils [req-6cd9df8e-468d-46e0-937f-4f41b04de91c req-ce1e53c7-1e1c-4f26-a39d-0c97db42aa7f service nova] Acquiring lock "refresh_cache-11000d92-0094-4561-a807-ca76610ea549" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1864.019043] env[67893]: DEBUG oslo_concurrency.lockutils [req-6cd9df8e-468d-46e0-937f-4f41b04de91c req-ce1e53c7-1e1c-4f26-a39d-0c97db42aa7f service nova] Acquired lock "refresh_cache-11000d92-0094-4561-a807-ca76610ea549" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1864.019213] env[67893]: DEBUG nova.network.neutron [req-6cd9df8e-468d-46e0-937f-4f41b04de91c req-ce1e53c7-1e1c-4f26-a39d-0c97db42aa7f service nova] [instance: 11000d92-0094-4561-a807-ca76610ea549] Refreshing network info cache for port e7630305-aebb-429d-a077-ef3feca6ddf0 {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1864.278965] env[67893]: DEBUG nova.network.neutron [req-6cd9df8e-468d-46e0-937f-4f41b04de91c req-ce1e53c7-1e1c-4f26-a39d-0c97db42aa7f service nova] [instance: 11000d92-0094-4561-a807-ca76610ea549] Updated VIF entry in instance network info cache for port e7630305-aebb-429d-a077-ef3feca6ddf0. {{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1864.279347] env[67893]: DEBUG nova.network.neutron [req-6cd9df8e-468d-46e0-937f-4f41b04de91c req-ce1e53c7-1e1c-4f26-a39d-0c97db42aa7f service nova] [instance: 11000d92-0094-4561-a807-ca76610ea549] Updating instance_info_cache with network_info: [{"id": "e7630305-aebb-429d-a077-ef3feca6ddf0", "address": "fa:16:3e:48:2f:87", "network": {"id": "f5f37611-ef93-4a5d-8b1c-169af83eb7a6", "bridge": "br-int", "label": "tempest-ServersTestJSON-1801901943-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "88e59a371b0d4dedb303e9b7f6d69b9d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c3e2368-4a35-4aa5-9135-23daedbbf9ef", "external-id": "nsx-vlan-transportzone-125", "segmentation_id": 125, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape7630305-ae", "ovs_interfaceid": "e7630305-aebb-429d-a077-ef3feca6ddf0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1864.288382] env[67893]: DEBUG oslo_concurrency.lockutils [req-6cd9df8e-468d-46e0-937f-4f41b04de91c req-ce1e53c7-1e1c-4f26-a39d-0c97db42aa7f service nova] Releasing lock "refresh_cache-11000d92-0094-4561-a807-ca76610ea549" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1864.373379] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455481, 'name': CreateVM_Task, 'duration_secs': 0.299629} completed successfully. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1864.373551] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 11000d92-0094-4561-a807-ca76610ea549] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1864.374221] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1864.374388] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1864.374710] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1864.374964] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9ebe99e1-03b2-4b4f-8d36-97ffcbc7802e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1864.379380] env[67893]: DEBUG oslo_vmware.api [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Waiting for the task: (returnval){ [ 1864.379380] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52ccfcba-e876-0bfd-f171-a5bf0cc62b30" [ 1864.379380] env[67893]: _type = "Task" [ 1864.379380] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1864.387079] env[67893]: DEBUG oslo_vmware.api [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52ccfcba-e876-0bfd-f171-a5bf0cc62b30, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1864.889870] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1864.890177] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1864.890351] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1870.493497] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Acquiring lock "a5151a22-4174-4f66-a83a-55a0dd01c407" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1870.493857] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Lock "a5151a22-4174-4f66-a83a-55a0dd01c407" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1878.429590] env[67893]: DEBUG oslo_concurrency.lockutils [None req-2c71fbb1-0b8b-4f5f-a6b8-56cc9b463295 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Acquiring lock "94760898-4f3c-4f41-85be-366f4108d0ba" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1881.719943] env[67893]: DEBUG oslo_concurrency.lockutils [None req-6b0ef2be-21ec-4e27-881d-14b43e209680 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquiring lock "11000d92-0094-4561-a807-ca76610ea549" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1908.910865] env[67893]: WARNING oslo_vmware.rw_handles [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1908.910865] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent 
call last): [ 1908.910865] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1908.910865] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1908.910865] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1908.910865] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 1908.910865] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1908.910865] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1908.910865] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1908.910865] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1908.910865] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1908.910865] env[67893]: ERROR oslo_vmware.rw_handles [ 1908.911545] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/c93074ab-f5e9-4e47-93ff-eacfd1fe3fb4/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1908.913420] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1908.913686] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Copying Virtual Disk [datastore1] vmware_temp/c93074ab-f5e9-4e47-93ff-eacfd1fe3fb4/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/c93074ab-f5e9-4e47-93ff-eacfd1fe3fb4/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1908.914023] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-bf1bed17-7f4f-4bab-8830-ab061571b2af {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1908.922914] env[67893]: DEBUG oslo_vmware.api [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Waiting for the task: (returnval){ [ 1908.922914] env[67893]: value = "task-3455482" [ 1908.922914] env[67893]: _type = "Task" [ 1908.922914] env[67893]: } to complete. 
{{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1908.930824] env[67893]: DEBUG oslo_vmware.api [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Task: {'id': task-3455482, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1909.398983] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1909.432461] env[67893]: DEBUG oslo_vmware.exceptions [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1909.432684] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1909.433272] env[67893]: ERROR nova.compute.manager [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1909.433272] env[67893]: Faults: ['InvalidArgument'] [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] Traceback (most recent call last): [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] yield resources [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] self.driver.spawn(context, instance, image_meta, [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] self._fetch_image_if_missing(context, vi) [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] 
File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] image_cache(vi, tmp_image_ds_loc) [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] vm_util.copy_virtual_disk( [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] session._wait_for_task(vmdk_copy_task) [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] return self.wait_for_task(task_ref) [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] return evt.wait() [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] result = hub.switch() [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] return self.greenlet.switch() [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] self.f(*self.args, **self.kw) [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] raise exceptions.translate_fault(task_info.error) [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] Faults: ['InvalidArgument'] [ 1909.433272] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] [ 1909.434571] env[67893]: INFO nova.compute.manager [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 
25d67f98-c132-434b-9d22-4569585527eb] Terminating instance [ 1909.435116] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1909.435330] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1909.435574] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4a29b7e1-ec8c-4432-9296-6f90b2560bd3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1909.437881] env[67893]: DEBUG nova.compute.manager [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1909.438087] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1909.438795] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d5944d6-88bd-43b5-b745-c350de4eee63 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1909.445039] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1909.445262] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-116deb54-7d2e-4b6f-8dfa-ec2b54f2a212 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1909.447487] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1909.447659] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1909.448629] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-022aa57b-366b-4670-9a13-4e7501611424 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1909.453443] env[67893]: DEBUG oslo_vmware.api [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Waiting for the task: (returnval){ [ 1909.453443] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52fbef05-1cc0-5980-9937-32e4f8c5bf20" [ 1909.453443] env[67893]: _type = "Task" [ 1909.453443] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1909.460353] env[67893]: DEBUG oslo_vmware.api [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52fbef05-1cc0-5980-9937-32e4f8c5bf20, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1909.514865] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1909.515096] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1909.515278] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Deleting the datastore file [datastore1] 25d67f98-c132-434b-9d22-4569585527eb {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1909.515534] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-0c2e8f0d-90bc-495f-a831-4dead0f6f41b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1909.521580] env[67893]: DEBUG oslo_vmware.api [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Waiting for the task: (returnval){ [ 1909.521580] env[67893]: value = "task-3455484" [ 1909.521580] env[67893]: _type = "Task" [ 1909.521580] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1909.528901] env[67893]: DEBUG oslo_vmware.api [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Task: {'id': task-3455484, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1909.963574] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1909.963883] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Creating directory with path [datastore1] vmware_temp/2cecd939-d1c5-417f-9b2b-91a1c1b42617/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1909.964034] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-446734be-0d02-4984-9cdf-550f26298c2f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1909.974882] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Created directory with path [datastore1] vmware_temp/2cecd939-d1c5-417f-9b2b-91a1c1b42617/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1909.975083] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Fetch image to [datastore1] vmware_temp/2cecd939-d1c5-417f-9b2b-91a1c1b42617/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1909.975255] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/2cecd939-d1c5-417f-9b2b-91a1c1b42617/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1909.975937] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74462997-92f1-4d0a-bc74-0087a4d4dbaa {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1909.982167] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c921d7c-47dc-4ee3-af63-a3db60922e8a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1909.991866] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d6da0e3-8e3b-41fd-b315-e82455be3cf3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1910.021506] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-0153848d-e7f5-4e0e-8be0-55bdbe9ad968 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1910.031906] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-27a75274-c983-4164-8dfb-de81f83087be {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1910.033502] env[67893]: DEBUG oslo_vmware.api [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Task: {'id': task-3455484, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079114} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1910.033734] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1910.033908] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1910.034096] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1910.034269] env[67893]: INFO nova.compute.manager [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Took 0.60 seconds to destroy the instance on the hypervisor. 
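The "Waiting for the task ... to complete", "progress is 0%", and "completed successfully ... duration_secs" records around task-3455484 show the poll-until-done loop that oslo.vmware's wait_for_task runs against vCenter tasks. A minimal sketch of that polling pattern follows; get_task_info and the TaskFault class are hypothetical stand-ins for the TaskInfo snapshot and fault translation, not the real oslo.vmware API.

    import time

    class TaskFault(Exception):
        """Translated error from a remote task that ended in an error state."""

    def wait_for_task(get_task_info, interval=0.5):
        # get_task_info is a hypothetical callable returning an object with
        # .state ('running' | 'success' | 'error'), .progress, and .error,
        # standing in for the vSphere TaskInfo property read each cycle.
        start = time.monotonic()
        while True:
            info = get_task_info()
            if info.state == 'success':
                print("completed successfully. duration_secs=%.6f"
                      % (time.monotonic() - start))
                return info
            if info.state == 'error':
                # a real client translates the fault into a typed exception,
                # as in the VimFaultException records elsewhere in this log
                raise TaskFault(info.error)
            print("progress is %s%%" % info.progress)
            time.sleep(interval)

The fixed-interval loop explains why short operations such as DeleteDatastoreFile_Task still log one "progress is 0%" line before the completion record: the first poll almost always lands before the task finishes.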
[ 1910.036314] env[67893]: DEBUG nova.compute.claims [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1910.036491] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1910.036758] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1910.054618] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1910.202793] env[67893]: DEBUG oslo_vmware.rw_handles [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2cecd939-d1c5-417f-9b2b-91a1c1b42617/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1910.263947] env[67893]: DEBUG oslo_vmware.rw_handles [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1910.264162] env[67893]: DEBUG oslo_vmware.rw_handles [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2cecd939-d1c5-417f-9b2b-91a1c1b42617/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1910.295079] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41ebe8e1-df64-4579-8dc0-0bc0dcf62080 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1910.302564] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc95f206-90bf-40dc-beef-a6e3e839e757 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1910.332484] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5be70043-69fe-48a5-b9af-1c0a3353ac2b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1910.339319] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0cc43ed-195f-4ce4-9b44-4d759a9ea582 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1910.352747] env[67893]: DEBUG nova.compute.provider_tree [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1910.363080] env[67893]: DEBUG nova.scheduler.client.report [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1910.379651] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.343s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1910.380216] env[67893]: ERROR nova.compute.manager [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1910.380216] env[67893]: Faults: ['InvalidArgument'] [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] Traceback (most recent call last): [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1910.380216] 
env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] self.driver.spawn(context, instance, image_meta, [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] self._fetch_image_if_missing(context, vi) [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] image_cache(vi, tmp_image_ds_loc) [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] vm_util.copy_virtual_disk( [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] session._wait_for_task(vmdk_copy_task) [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] return self.wait_for_task(task_ref) [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] return evt.wait() [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] result = hub.switch() [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] return self.greenlet.switch() [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] self.f(*self.args, **self.kw) [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] raise exceptions.translate_fault(task_info.error) [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] Faults: ['InvalidArgument'] [ 1910.380216] env[67893]: ERROR nova.compute.manager [instance: 25d67f98-c132-434b-9d22-4569585527eb] [ 1910.381151] env[67893]: DEBUG nova.compute.utils [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1910.382712] env[67893]: DEBUG nova.compute.manager [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Build of instance 25d67f98-c132-434b-9d22-4569585527eb was re-scheduled: A specified parameter was not correct: fileType [ 1910.382712] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1910.383115] env[67893]: DEBUG nova.compute.manager [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1910.383320] env[67893]: DEBUG nova.compute.manager [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1910.383500] env[67893]: DEBUG nova.compute.manager [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1910.383665] env[67893]: DEBUG nova.network.neutron [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1910.679686] env[67893]: DEBUG nova.network.neutron [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1910.692722] env[67893]: INFO nova.compute.manager [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Took 0.31 seconds to deallocate network for instance. [ 1910.783235] env[67893]: INFO nova.scheduler.client.report [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Deleted allocations for instance 25d67f98-c132-434b-9d22-4569585527eb [ 1910.808799] env[67893]: DEBUG oslo_concurrency.lockutils [None req-4778423b-4e1f-4082-839b-fb4313770362 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Lock "25d67f98-c132-434b-9d22-4569585527eb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 657.902s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1910.809694] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3670467c-6b56-49cb-bf6f-95380ba549d9 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Lock "25d67f98-c132-434b-9d22-4569585527eb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 461.746s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1910.810367] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3670467c-6b56-49cb-bf6f-95380ba549d9 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Acquiring lock "25d67f98-c132-434b-9d22-4569585527eb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1910.810367] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3670467c-6b56-49cb-bf6f-95380ba549d9 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Lock "25d67f98-c132-434b-9d22-4569585527eb-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1910.810367] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3670467c-6b56-49cb-bf6f-95380ba549d9 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Lock "25d67f98-c132-434b-9d22-4569585527eb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1910.812900] env[67893]: INFO nova.compute.manager [None req-3670467c-6b56-49cb-bf6f-95380ba549d9 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Terminating instance [ 1910.814749] env[67893]: DEBUG nova.compute.manager [None req-3670467c-6b56-49cb-bf6f-95380ba549d9 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1910.814951] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3670467c-6b56-49cb-bf6f-95380ba549d9 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1910.815434] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-caaeeb1a-e216-4370-8d8a-b6ad065084d5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1910.824580] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f66fb01-1b64-4181-aeee-9accf9190dd7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1910.836750] env[67893]: DEBUG nova.compute.manager [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1910.858381] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-3670467c-6b56-49cb-bf6f-95380ba549d9 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 25d67f98-c132-434b-9d22-4569585527eb could not be found.
[ 1910.858381] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3670467c-6b56-49cb-bf6f-95380ba549d9 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1910.858532] env[67893]: INFO nova.compute.manager [None req-3670467c-6b56-49cb-bf6f-95380ba549d9 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1910.858797] env[67893]: DEBUG oslo.service.loopingcall [None req-3670467c-6b56-49cb-bf6f-95380ba549d9 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1910.859014] env[67893]: DEBUG nova.compute.manager [-] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1910.860033] env[67893]: DEBUG nova.network.neutron [-] [instance: 25d67f98-c132-434b-9d22-4569585527eb] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1910.894242] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1910.894490] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1910.896046] env[67893]: INFO nova.compute.claims [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1910.905163] env[67893]: DEBUG nova.network.neutron [-] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1910.911606] env[67893]: INFO nova.compute.manager [-] [instance: 25d67f98-c132-434b-9d22-4569585527eb] Took 0.05 seconds to deallocate network for instance.
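Most of the locking chatter in this section comes from oslo_concurrency.lockutils, which logs every acquire and release together with how long the caller waited and how long the lock was held (for example, the build lock above was held 657.902s while the terminate request waited 461.746s for it). Below is a rough sketch of that decorator pattern, using assumed names rather than the real oslo.concurrency source; it also shows why the lock messages carry '<locals>' qualnames: the synchronized functions are defined inside other methods.

import functools
import threading
import time

_locks = {}


def synchronized(name):
    # One process-local lock per name; acquire/release logged with timings.
    lock = _locks.setdefault(name, threading.Lock())

    def wrap(f):
        @functools.wraps(f)
        def inner(*args, **kwargs):
            t0 = time.monotonic()
            with lock:
                print(f'Lock "{name}" acquired by "{f.__qualname__}" :: '
                      f'waited {time.monotonic() - t0:.3f}s')
                t1 = time.monotonic()
                try:
                    return f(*args, **kwargs)
                finally:
                    print(f'Lock "{name}" "released" by "{f.__qualname__}" '
                          f':: held {time.monotonic() - t1:.3f}s')
        return inner
    return wrap


def terminate_instance(uuid):
    # The nested function's qualname is
    # 'terminate_instance.<locals>.do_terminate_instance', which is exactly
    # the shape that appears in the lock log lines above.
    @synchronized(uuid)
    def do_terminate_instance():
        time.sleep(0.01)
    do_terminate_instance()


terminate_instance("25d67f98-c132-434b-9d22-4569585527eb")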
[ 1911.004639] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3670467c-6b56-49cb-bf6f-95380ba549d9 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Lock "25d67f98-c132-434b-9d22-4569585527eb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.195s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1911.005486] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "25d67f98-c132-434b-9d22-4569585527eb" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 106.634s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1911.005836] env[67893]: INFO nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 25d67f98-c132-434b-9d22-4569585527eb] During sync_power_state the instance has a pending task (deleting). Skip. [ 1911.005836] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "25d67f98-c132-434b-9d22-4569585527eb" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1911.102923] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff402bef-3b79-42bc-889f-e4d67b10f990 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1911.110618] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e580b998-2e87-4439-8cab-105416693343 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1911.139055] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32da0bec-a6c1-4ea2-9c68-e72b531058d3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1911.145493] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b3e4cb1-1628-4c24-a3c8-75aecaa8cd47 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1911.158161] env[67893]: DEBUG nova.compute.provider_tree [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1911.166714] env[67893]: DEBUG nova.scheduler.client.report [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400,
'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1911.181622] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.287s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1911.182103] env[67893]: DEBUG nova.compute.manager [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1911.212162] env[67893]: DEBUG nova.compute.utils [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1911.213615] env[67893]: DEBUG nova.compute.manager [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1911.213615] env[67893]: DEBUG nova.network.neutron [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1911.221235] env[67893]: DEBUG nova.compute.manager [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1911.277179] env[67893]: DEBUG nova.policy [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9115f73c22bf4b0e9e5439363832061d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7a19d9bde3814325847c06cec1af09b7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 1911.282653] env[67893]: DEBUG nova.compute.manager [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1911.309597] env[67893]: DEBUG nova.virt.hardware [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=<?>,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-15T20:39:38Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1911.310560] env[67893]: DEBUG nova.virt.hardware [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1911.310560] env[67893]: DEBUG nova.virt.hardware [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1911.310560] env[67893]: DEBUG nova.virt.hardware [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1911.310560] env[67893]: DEBUG nova.virt.hardware [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1911.310560] env[67893]: DEBUG nova.virt.hardware [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1911.310803] env[67893]: DEBUG nova.virt.hardware [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1911.310803] env[67893]: DEBUG nova.virt.hardware [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1911.310983]
env[67893]: DEBUG nova.virt.hardware [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1911.311165] env[67893]: DEBUG nova.virt.hardware [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1911.311342] env[67893]: DEBUG nova.virt.hardware [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1911.312197] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2831ddd-4c07-4f20-9984-3b3a0fe3fbe5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1911.320062] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9db42fe7-11ea-44a8-92f6-b2aeb1a10d79 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1911.585730] env[67893]: DEBUG nova.network.neutron [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Successfully created port: ec47efbb-18e4-43bb-94b9-89195667691a {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1912.284969] env[67893]: DEBUG nova.network.neutron [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Successfully updated port: ec47efbb-18e4-43bb-94b9-89195667691a {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1912.296598] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "refresh_cache-7169c720-f69e-40a3-95d2-473639884cd9" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1912.296748] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquired lock "refresh_cache-7169c720-f69e-40a3-95d2-473639884cd9" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1912.296901] env[67893]: DEBUG nova.network.neutron [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1912.334513] env[67893]: DEBUG 
nova.network.neutron [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1912.490130] env[67893]: DEBUG nova.network.neutron [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Updating instance_info_cache with network_info: [{"id": "ec47efbb-18e4-43bb-94b9-89195667691a", "address": "fa:16:3e:ac:1e:37", "network": {"id": "b5038471-f3b2-4f1f-b2f9-62effa71f1aa", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1405799721-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7a19d9bde3814325847c06cec1af09b7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ef02af-c508-432f-ae29-3a219701d584", "external-id": "nsx-vlan-transportzone-313", "segmentation_id": 313, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapec47efbb-18", "ovs_interfaceid": "ec47efbb-18e4-43bb-94b9-89195667691a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1912.501018] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Releasing lock "refresh_cache-7169c720-f69e-40a3-95d2-473639884cd9" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1912.501326] env[67893]: DEBUG nova.compute.manager [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Instance network_info: |[{"id": "ec47efbb-18e4-43bb-94b9-89195667691a", "address": "fa:16:3e:ac:1e:37", "network": {"id": "b5038471-f3b2-4f1f-b2f9-62effa71f1aa", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1405799721-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7a19d9bde3814325847c06cec1af09b7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ef02af-c508-432f-ae29-3a219701d584", "external-id": "nsx-vlan-transportzone-313", "segmentation_id": 313, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapec47efbb-18", 
"ovs_interfaceid": "ec47efbb-18e4-43bb-94b9-89195667691a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1912.501709] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ac:1e:37', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '89ef02af-c508-432f-ae29-3a219701d584', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ec47efbb-18e4-43bb-94b9-89195667691a', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1912.509271] env[67893]: DEBUG oslo.service.loopingcall [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1912.509716] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1912.509943] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1d0caf5b-28c5-44df-94a9-e1addb738cad {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1912.530603] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1912.530603] env[67893]: value = "task-3455485" [ 1912.530603] env[67893]: _type = "Task" [ 1912.530603] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1912.538953] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455485, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1912.714239] env[67893]: DEBUG nova.compute.manager [req-a2794018-a6b4-465e-b26f-c24c1eeeceb6 req-3bdd5f17-68f3-4434-acd2-a612fe5abf45 service nova] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Received event network-vif-plugged-ec47efbb-18e4-43bb-94b9-89195667691a {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1912.714476] env[67893]: DEBUG oslo_concurrency.lockutils [req-a2794018-a6b4-465e-b26f-c24c1eeeceb6 req-3bdd5f17-68f3-4434-acd2-a612fe5abf45 service nova] Acquiring lock "7169c720-f69e-40a3-95d2-473639884cd9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1912.714690] env[67893]: DEBUG oslo_concurrency.lockutils [req-a2794018-a6b4-465e-b26f-c24c1eeeceb6 req-3bdd5f17-68f3-4434-acd2-a612fe5abf45 service nova] Lock "7169c720-f69e-40a3-95d2-473639884cd9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1912.714859] env[67893]: DEBUG oslo_concurrency.lockutils [req-a2794018-a6b4-465e-b26f-c24c1eeeceb6 req-3bdd5f17-68f3-4434-acd2-a612fe5abf45 service nova] Lock "7169c720-f69e-40a3-95d2-473639884cd9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1912.715043] env[67893]: DEBUG nova.compute.manager [req-a2794018-a6b4-465e-b26f-c24c1eeeceb6 req-3bdd5f17-68f3-4434-acd2-a612fe5abf45 service nova] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] No waiting events found dispatching network-vif-plugged-ec47efbb-18e4-43bb-94b9-89195667691a {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1912.715212] env[67893]: WARNING nova.compute.manager [req-a2794018-a6b4-465e-b26f-c24c1eeeceb6 req-3bdd5f17-68f3-4434-acd2-a612fe5abf45 service nova] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Received unexpected event network-vif-plugged-ec47efbb-18e4-43bb-94b9-89195667691a for instance with vm_state building and task_state spawning. [ 1912.715373] env[67893]: DEBUG nova.compute.manager [req-a2794018-a6b4-465e-b26f-c24c1eeeceb6 req-3bdd5f17-68f3-4434-acd2-a612fe5abf45 service nova] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Received event network-changed-ec47efbb-18e4-43bb-94b9-89195667691a {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1912.715526] env[67893]: DEBUG nova.compute.manager [req-a2794018-a6b4-465e-b26f-c24c1eeeceb6 req-3bdd5f17-68f3-4434-acd2-a612fe5abf45 service nova] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Refreshing instance network info cache due to event network-changed-ec47efbb-18e4-43bb-94b9-89195667691a.
{{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1912.715758] env[67893]: DEBUG oslo_concurrency.lockutils [req-a2794018-a6b4-465e-b26f-c24c1eeeceb6 req-3bdd5f17-68f3-4434-acd2-a612fe5abf45 service nova] Acquiring lock "refresh_cache-7169c720-f69e-40a3-95d2-473639884cd9" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1912.715919] env[67893]: DEBUG oslo_concurrency.lockutils [req-a2794018-a6b4-465e-b26f-c24c1eeeceb6 req-3bdd5f17-68f3-4434-acd2-a612fe5abf45 service nova] Acquired lock "refresh_cache-7169c720-f69e-40a3-95d2-473639884cd9" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1912.716093] env[67893]: DEBUG nova.network.neutron [req-a2794018-a6b4-465e-b26f-c24c1eeeceb6 req-3bdd5f17-68f3-4434-acd2-a612fe5abf45 service nova] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Refreshing network info cache for port ec47efbb-18e4-43bb-94b9-89195667691a {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1913.040506] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455485, 'name': CreateVM_Task, 'duration_secs': 0.378668} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1913.040668] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1913.049888] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1913.050068] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1913.050374] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1913.050620] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5df38dc2-f12e-468f-b113-5b1cb563f9bf {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1913.054976] env[67893]: DEBUG oslo_vmware.api [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Waiting for the task: (returnval){ [ 1913.054976] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52ccb855-426e-f5c1-83d3-c271a5ed5982" [ 1913.054976] env[67893]: _type = "Task" [ 1913.054976] env[67893]: } to complete. 
{{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1913.062378] env[67893]: DEBUG oslo_vmware.api [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52ccb855-426e-f5c1-83d3-c271a5ed5982, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1913.171518] env[67893]: DEBUG nova.network.neutron [req-a2794018-a6b4-465e-b26f-c24c1eeeceb6 req-3bdd5f17-68f3-4434-acd2-a612fe5abf45 service nova] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Updated VIF entry in instance network info cache for port ec47efbb-18e4-43bb-94b9-89195667691a. {{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1913.171876] env[67893]: DEBUG nova.network.neutron [req-a2794018-a6b4-465e-b26f-c24c1eeeceb6 req-3bdd5f17-68f3-4434-acd2-a612fe5abf45 service nova] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Updating instance_info_cache with network_info: [{"id": "ec47efbb-18e4-43bb-94b9-89195667691a", "address": "fa:16:3e:ac:1e:37", "network": {"id": "b5038471-f3b2-4f1f-b2f9-62effa71f1aa", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1405799721-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7a19d9bde3814325847c06cec1af09b7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ef02af-c508-432f-ae29-3a219701d584", "external-id": "nsx-vlan-transportzone-313", "segmentation_id": 313, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapec47efbb-18", "ovs_interfaceid": "ec47efbb-18e4-43bb-94b9-89195667691a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1913.181670] env[67893]: DEBUG oslo_concurrency.lockutils [req-a2794018-a6b4-465e-b26f-c24c1eeeceb6 req-3bdd5f17-68f3-4434-acd2-a612fe5abf45 service nova] Releasing lock "refresh_cache-7169c720-f69e-40a3-95d2-473639884cd9" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1913.564886] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1913.565269] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1913.565373] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1913.859241] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1913.859422] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1913.859590] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1913.882188] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1913.882406] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1913.882555] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1913.882695] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1913.882826] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1913.882954] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1913.883092] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Skipping network cache update for instance because it is Building. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1913.883251] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1913.883382] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 11000d92-0094-4561-a807-ca76610ea549] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1913.883502] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1913.883623] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1914.879250] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1915.859603] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1916.858077] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1916.858348] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1916.858474] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1916.858613] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1917.860329] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1918.854541] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1921.858363] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1921.869954] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1921.870197] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1921.870367] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1921.870520] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1921.871668] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-715a9aa2-252c-4831-8242-c414cb82d7cf {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1921.880214] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20eadec8-6155-45fd-8a41-f361e37476db {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1921.894252] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80f7c9d9-85f5-4c20-adf0-bf3ef8627448 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1921.901072] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e50daf69-38e8-47a5-97ee-dc2209ba01ce {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1921.929477] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e 
None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180975MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1921.929704] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1921.930013] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1922.005169] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 41b5c5ec-936a-4abe-9db7-38d0d2aa371d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.005340] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance e1849daf-3781-42ef-bede-267efbb652c9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.005470] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2875b0a3-0213-4908-b86b-ce45a8901553 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.005594] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance dfb92d1c-c2a5-49c1-8526-3743cb385c97 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.005713] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance ad60df35-54c0-459e-8a25-981922ae0a88 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.005831] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 9fc9f6b0-928e-46b4-ad7c-9217b2f31575 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.005948] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 15893b5f-a02a-4ce7-80c9-eea0658f9ac7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.006079] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 94760898-4f3c-4f41-85be-366f4108d0ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.006194] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 11000d92-0094-4561-a807-ca76610ea549 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.006304] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 7169c720-f69e-40a3-95d2-473639884cd9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.019986] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 72410dc2-74d9-4d59-bdd1-ad45b01c482b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1922.029862] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a5151a22-4174-4f66-a83a-55a0dd01c407 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1922.030091] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1922.030242] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1922.176973] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f145fd18-c060-4ebd-aed6-63daaaf4c460 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1922.185940] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13be4017-c77a-4c73-8eba-108d21664957 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1922.214925] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b2771d8-3134-4f51-8b97-f3eb48d883b3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1922.222141] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e87c2248-8481-4273-aab0-faab6a2558e0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1922.234870] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1922.243449] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1922.257897] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1922.258091] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.328s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1958.147494] env[67893]: WARNING oslo_vmware.rw_handles [None 
req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1958.147494] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1958.147494] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1958.147494] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 1958.147494] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1958.147494] env[67893]: ERROR oslo_vmware.rw_handles response.begin()
[ 1958.147494] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1958.147494] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 1958.147494] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1958.147494] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 1958.147494] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1958.147494] env[67893]: ERROR oslo_vmware.rw_handles
[ 1958.148330] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/2cecd939-d1c5-417f-9b2b-91a1c1b42617/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1958.150086] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1958.150341] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Copying Virtual Disk [datastore1] vmware_temp/2cecd939-d1c5-417f-9b2b-91a1c1b42617/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/2cecd939-d1c5-417f-9b2b-91a1c1b42617/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1958.150664] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-8d2c1aba-8f70-4ba1-8c9b-6819ab79d61a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1958.159451] env[67893]: DEBUG oslo_vmware.api [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Waiting for the task: (returnval){
[ 1958.159451] env[67893]: value = "task-3455486"
[ 1958.159451] env[67893]: _type = "Task"
[ 1958.159451] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1958.166897] env[67893]: DEBUG oslo_vmware.api [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Task: {'id': task-3455486, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1958.669482] env[67893]: DEBUG oslo_vmware.exceptions [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 1958.669802] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1958.670401] env[67893]: ERROR nova.compute.manager [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1958.670401] env[67893]: Faults: ['InvalidArgument']
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Traceback (most recent call last):
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] yield resources
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] self.driver.spawn(context, instance, image_meta,
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] self._fetch_image_if_missing(context, vi)
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] image_cache(vi, tmp_image_ds_loc)
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] vm_util.copy_virtual_disk(
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] session._wait_for_task(vmdk_copy_task)
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] return self.wait_for_task(task_ref)
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] return evt.wait()
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] result = hub.switch()
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] return self.greenlet.switch()
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] self.f(*self.args, **self.kw)
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] raise exceptions.translate_fault(task_info.error)
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Faults: ['InvalidArgument']
[ 1958.670401] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d]
[ 1958.671476] env[67893]: INFO nova.compute.manager [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Terminating instance
[ 1958.672393] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476
tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1958.672610] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1958.672856] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-75747898-271a-4ad7-8bd1-2f411d2d2d9b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1958.676617] env[67893]: DEBUG nova.compute.manager [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1958.676824] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1958.677599] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f5f571b-adcb-4ee7-bc51-eac92c9cab12 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1958.685208] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1958.685208] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b5f78525-ca71-4a68-b52f-a1666647b371 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1958.687372] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1958.687540] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1958.688552] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fec3c0a1-f6cd-49f1-bd17-66e4f8bf2ba7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1958.693884] env[67893]: DEBUG oslo_vmware.api [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Waiting for the task: (returnval){ [ 1958.693884] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52932e71-131c-5aee-1cc2-0efcbf7a55fa" [ 1958.693884] env[67893]: _type = "Task" [ 1958.693884] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1958.707834] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1958.707970] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Creating directory with path [datastore1] vmware_temp/4344c1af-1de7-4a9c-aa0c-98a3cf4b4469/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1958.708169] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-aa1e5c85-9b7a-4ba0-a312-7ce4dc3038c6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1958.728855] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Created directory with path [datastore1] vmware_temp/4344c1af-1de7-4a9c-aa0c-98a3cf4b4469/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1958.729016] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Fetch image to [datastore1] vmware_temp/4344c1af-1de7-4a9c-aa0c-98a3cf4b4469/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1958.729263] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/4344c1af-1de7-4a9c-aa0c-98a3cf4b4469/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1958.730070] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-a726c0ed-efeb-4815-97e6-e49b45da2c23 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1958.737059] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ae8f56b-bdbf-4a12-871d-5b5cbce8eb65 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1958.746301] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96148df0-7c1c-444c-9d6e-2dd7a260721b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1958.778108] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28811968-9f18-436d-9a32-386fe1f67a42 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1958.780739] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1958.780930] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1958.781123] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Deleting the datastore file [datastore1] 41b5c5ec-936a-4abe-9db7-38d0d2aa371d {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1958.781379] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5d63c3e2-9c98-4a98-afe8-d4c5138b21cc {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1958.787836] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1045e9f8-9581-4f95-a4f0-60b412a66645 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1958.789579] env[67893]: DEBUG oslo_vmware.api [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Waiting for the task: (returnval){ [ 1958.789579] env[67893]: value = "task-3455488" [ 1958.789579] env[67893]: _type = "Task" [ 1958.789579] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1958.797023] env[67893]: DEBUG oslo_vmware.api [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Task: {'id': task-3455488, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1958.816203] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1958.895441] env[67893]: DEBUG oslo_vmware.rw_handles [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4344c1af-1de7-4a9c-aa0c-98a3cf4b4469/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1958.954540] env[67893]: DEBUG oslo_vmware.rw_handles [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1958.954762] env[67893]: DEBUG oslo_vmware.rw_handles [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4344c1af-1de7-4a9c-aa0c-98a3cf4b4469/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1959.299518] env[67893]: DEBUG oslo_vmware.api [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Task: {'id': task-3455488, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068843} completed successfully. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1959.299873] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1959.299927] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1959.300106] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1959.300282] env[67893]: INFO nova.compute.manager [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Took 0.62 seconds to destroy the instance on the hypervisor. [ 1959.302354] env[67893]: DEBUG nova.compute.claims [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1959.302521] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1959.302738] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1959.483406] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14356431-ffb1-4cd8-8862-18f2b274db07 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1959.490407] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ae520ed-cb82-4fb8-8872-be3cc1176d9c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1959.521037] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc0fd404-26f1-4605-9e6a-e9718ed3d88d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1959.527993] 
env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c2c62de-e445-46bc-8533-1354f6fd1316 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1959.540558] env[67893]: DEBUG nova.compute.provider_tree [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1959.549321] env[67893]: DEBUG nova.scheduler.client.report [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1959.563122] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.260s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1959.563678] env[67893]: ERROR nova.compute.manager [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1959.563678] env[67893]: Faults: ['InvalidArgument']
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Traceback (most recent call last):
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] self.driver.spawn(context, instance, image_meta,
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] self._fetch_image_if_missing(context, vi)
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] image_cache(vi, tmp_image_ds_loc)
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] vm_util.copy_virtual_disk(
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] session._wait_for_task(vmdk_copy_task)
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] return self.wait_for_task(task_ref)
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] return evt.wait()
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] result = hub.switch()
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] return self.greenlet.switch()
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] self.f(*self.args, **self.kw)
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] raise exceptions.translate_fault(task_info.error)
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Faults: ['InvalidArgument']
[ 1959.563678] env[67893]: ERROR nova.compute.manager [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d]
[ 1959.564701] env[67893]: DEBUG nova.compute.utils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance:
41b5c5ec-936a-4abe-9db7-38d0d2aa371d] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1959.565687] env[67893]: DEBUG nova.compute.manager [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Build of instance 41b5c5ec-936a-4abe-9db7-38d0d2aa371d was re-scheduled: A specified parameter was not correct: fileType [ 1959.565687] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1959.566064] env[67893]: DEBUG nova.compute.manager [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1959.566242] env[67893]: DEBUG nova.compute.manager [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1959.566425] env[67893]: DEBUG nova.compute.manager [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1959.566586] env[67893]: DEBUG nova.network.neutron [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1960.018314] env[67893]: DEBUG nova.network.neutron [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1960.032633] env[67893]: INFO nova.compute.manager [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Took 0.47 seconds to deallocate network for instance. 
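
The spawn failure above follows the standard oslo.vmware task-polling pattern: a CopyVirtualDisk_Task is created, _poll_task checks its state on a timer, and when the task reports an error the fault payload (InvalidArgument against the fileType parameter) is translated into a VimFaultException that unwinds through driver.spawn and _build_and_run_instance, which is why the build is then re-scheduled rather than simply failed. A minimal, self-contained sketch of that poll-and-translate loop; FakeTask, poll_task and VimFault are hypothetical stand-ins for illustration, not the oslo.vmware API:

    # Stdlib-only sketch of polling a remote task until it errors out,
    # then surfacing the fault list as a Python exception.
    import time

    class VimFault(Exception):
        # Plays the role of oslo_vmware.exceptions.VimFaultException.
        def __init__(self, msg, fault_list):
            super().__init__(msg)
            self.fault_list = fault_list

    class FakeTask:
        # Toy task: runs for one poll, then fails the way the log's
        # CopyVirtualDisk_Task does (InvalidArgument on fileType).
        def __init__(self):
            self._polls = 0

        def info(self):
            self._polls += 1
            if self._polls < 2:
                return {"state": "running", "progress": 0}
            return {"state": "error",
                    "error": ("A specified parameter was not correct: fileType",
                              ["InvalidArgument"])}

    def poll_task(task, interval=0.5):
        # Poll until the task leaves the running state; on error, translate
        # the fault payload into an exception so callers can unwind.
        while True:
            info = task.info()
            if info["state"] == "running":
                time.sleep(interval)
                continue
            if info["state"] == "error":
                msg, faults = info["error"]
                raise VimFault(msg, faults)
            return info

    try:
        poll_task(FakeTask(), interval=0)
    except VimFault as exc:
        print(exc, exc.fault_list)  # -> A specified parameter ... ['InvalidArgument']

The exception reaching _do_build_and_run_instance matches the "Build of instance ... was re-scheduled" line above: the claim is aborted, the allocation deleted, and the request handed back to the scheduler.
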
[ 1960.124399] env[67893]: INFO nova.scheduler.client.report [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Deleted allocations for instance 41b5c5ec-936a-4abe-9db7-38d0d2aa371d [ 1960.145703] env[67893]: DEBUG oslo_concurrency.lockutils [None req-eb01345b-ef7e-45a9-8360-fb00a0afdec8 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Lock "41b5c5ec-936a-4abe-9db7-38d0d2aa371d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 672.021s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1960.147031] env[67893]: DEBUG oslo_concurrency.lockutils [None req-339d37c8-dfe3-418f-874d-9642ef9b5005 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Lock "41b5c5ec-936a-4abe-9db7-38d0d2aa371d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 476.185s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1960.147175] env[67893]: DEBUG oslo_concurrency.lockutils [None req-339d37c8-dfe3-418f-874d-9642ef9b5005 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Acquiring lock "41b5c5ec-936a-4abe-9db7-38d0d2aa371d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1960.147437] env[67893]: DEBUG oslo_concurrency.lockutils [None req-339d37c8-dfe3-418f-874d-9642ef9b5005 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Lock "41b5c5ec-936a-4abe-9db7-38d0d2aa371d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1960.147621] env[67893]: DEBUG oslo_concurrency.lockutils [None req-339d37c8-dfe3-418f-874d-9642ef9b5005 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Lock "41b5c5ec-936a-4abe-9db7-38d0d2aa371d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1960.149570] env[67893]: INFO nova.compute.manager [None req-339d37c8-dfe3-418f-874d-9642ef9b5005 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Terminating instance [ 1960.151401] env[67893]: DEBUG nova.compute.manager [None req-339d37c8-dfe3-418f-874d-9642ef9b5005 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Start destroying the instance on the hypervisor. 
{{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1960.151610] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-339d37c8-dfe3-418f-874d-9642ef9b5005 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1960.152265] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a3469ad1-7023-4604-a73c-245fd4b8f57b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1960.157892] env[67893]: DEBUG nova.compute.manager [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1960.164299] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df59a773-cde2-456f-af35-c4c4e14e88a8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1960.195037] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-339d37c8-dfe3-418f-874d-9642ef9b5005 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 41b5c5ec-936a-4abe-9db7-38d0d2aa371d could not be found. [ 1960.195037] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-339d37c8-dfe3-418f-874d-9642ef9b5005 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1960.195037] env[67893]: INFO nova.compute.manager [None req-339d37c8-dfe3-418f-874d-9642ef9b5005 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1960.195229] env[67893]: DEBUG oslo.service.loopingcall [None req-339d37c8-dfe3-418f-874d-9642ef9b5005 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1960.197362] env[67893]: DEBUG nova.compute.manager [-] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1960.197480] env[67893]: DEBUG nova.network.neutron [-] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1960.211265] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1960.211501] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1960.213453] env[67893]: INFO nova.compute.claims [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1960.222078] env[67893]: DEBUG nova.network.neutron [-] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1960.234465] env[67893]: INFO nova.compute.manager [-] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] Took 0.04 seconds to deallocate network for instance. [ 1960.348885] env[67893]: DEBUG oslo_concurrency.lockutils [None req-339d37c8-dfe3-418f-874d-9642ef9b5005 tempest-ListServersNegativeTestJSON-1846153094 tempest-ListServersNegativeTestJSON-1846153094-project-member] Lock "41b5c5ec-936a-4abe-9db7-38d0d2aa371d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.202s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1960.351018] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "41b5c5ec-936a-4abe-9db7-38d0d2aa371d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 155.979s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1960.351018] env[67893]: INFO nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 41b5c5ec-936a-4abe-9db7-38d0d2aa371d] During sync_power_state the instance has a pending task (deleting). Skip. 
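
The waited/held figures in the surrounding lockutils lines come from timing the acquisition and release of a named semaphore: _locked_do_build_and_run_instance held the per-instance lock for 672.021s, so do_terminate_instance reports having waited 476.185s for the same name, and the power-state sync that finally gets it sees a pending delete and skips. A stripped-down illustration of that accounting, assuming nothing beyond the standard library (timed_lock is a hypothetical stand-in, not oslo_concurrency's implementation):

    # Named locks with "waited"/"held" bookkeeping, mirroring the pattern
    # of the Acquiring / acquired / released lines in the log.
    import threading
    import time
    from contextlib import contextmanager

    _LOCKS = {}
    _REGISTRY_GUARD = threading.Lock()

    @contextmanager
    def timed_lock(name):
        # One shared Lock object per name, created on first use.
        with _REGISTRY_GUARD:
            lock = _LOCKS.setdefault(name, threading.Lock())
        print(f'Acquiring lock "{name}"')
        t0 = time.monotonic()
        lock.acquire()
        print(f'Lock "{name}" acquired :: waited {time.monotonic() - t0:.3f}s')
        t1 = time.monotonic()
        try:
            yield
        finally:
            held = time.monotonic() - t1
            lock.release()
            print(f'Lock "{name}" "released" :: held {held:.3f}s')

    # Build, terminate and power-state sync all enter through the same name,
    # so a long-running build holds up a terminate on the same instance UUID.
    with timed_lock("41b5c5ec-936a-4abe-9db7-38d0d2aa371d"):
        pass  # per-instance work goes here

Serializing build, terminate and sync on one lock name per instance UUID is what keeps the terminate from racing the failed spawn's cleanup.
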
[ 1960.351018] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "41b5c5ec-936a-4abe-9db7-38d0d2aa371d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1960.403936] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f5c94d6-5c8d-467a-8e1f-fe0d89c82457 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1960.411778] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0e082fb-4034-4d84-8034-613b57750ec0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1960.441203] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37400edf-4379-4d59-990e-8054cc6291ea {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1960.448196] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67309426-ccf4-48d3-a127-365dcc63a902 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1960.462165] env[67893]: DEBUG nova.compute.provider_tree [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1960.470384] env[67893]: DEBUG nova.scheduler.client.report [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1960.483771] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.272s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1960.484241] env[67893]: DEBUG nova.compute.manager [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Start building networks asynchronously for instance. 
{{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1960.516072] env[67893]: DEBUG nova.compute.utils [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1960.517607] env[67893]: DEBUG nova.compute.manager [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1960.517779] env[67893]: DEBUG nova.network.neutron [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1960.527022] env[67893]: DEBUG nova.compute.manager [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1960.585125] env[67893]: DEBUG nova.policy [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '894285baafaf410ea301f676b78c45f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b439a6039a714a6fabd3c0477629d3c1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 1960.592992] env[67893]: DEBUG nova.compute.manager [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1960.618353] env[67893]: DEBUG nova.virt.hardware [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=<?>,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-15T20:39:38Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1960.618591] env[67893]: DEBUG nova.virt.hardware [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1960.618747] env[67893]: DEBUG nova.virt.hardware [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1960.618925] env[67893]: DEBUG nova.virt.hardware [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1960.619107] env[67893]: DEBUG nova.virt.hardware [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1960.619275] env[67893]: DEBUG nova.virt.hardware [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1960.619486] env[67893]: DEBUG nova.virt.hardware [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1960.619646] env[67893]: DEBUG nova.virt.hardware [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1960.619810] env[67893]: DEBUG nova.virt.hardware [None
req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1960.619972] env[67893]: DEBUG nova.virt.hardware [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1960.620159] env[67893]: DEBUG nova.virt.hardware [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1960.620990] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1d2a981-3abe-4046-80a3-c081827a7249 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1960.629497] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cba28390-65eb-4b61-b1f8-5879babaec2b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1960.874155] env[67893]: DEBUG nova.network.neutron [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Successfully created port: dcdc8c9c-47cb-4024-819d-2d59c3c04b4e {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1961.834313] env[67893]: DEBUG nova.network.neutron [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Successfully updated port: dcdc8c9c-47cb-4024-819d-2d59c3c04b4e {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1961.847623] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "refresh_cache-72410dc2-74d9-4d59-bdd1-ad45b01c482b" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1961.847798] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquired lock "refresh_cache-72410dc2-74d9-4d59-bdd1-ad45b01c482b" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1961.847959] env[67893]: DEBUG nova.network.neutron [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1961.888727] env[67893]: DEBUG nova.network.neutron [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 
tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1962.038330] env[67893]: DEBUG nova.network.neutron [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Updating instance_info_cache with network_info: [{"id": "dcdc8c9c-47cb-4024-819d-2d59c3c04b4e", "address": "fa:16:3e:2d:10:21", "network": {"id": "3269c624-7a70-494c-85bc-8230ffbbab83", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-740576182-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b439a6039a714a6fabd3c0477629d3c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdcdc8c9c-47", "ovs_interfaceid": "dcdc8c9c-47cb-4024-819d-2d59c3c04b4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1962.048554] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Releasing lock "refresh_cache-72410dc2-74d9-4d59-bdd1-ad45b01c482b" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1962.048845] env[67893]: DEBUG nova.compute.manager [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Instance network_info: |[{"id": "dcdc8c9c-47cb-4024-819d-2d59c3c04b4e", "address": "fa:16:3e:2d:10:21", "network": {"id": "3269c624-7a70-494c-85bc-8230ffbbab83", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-740576182-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b439a6039a714a6fabd3c0477629d3c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdcdc8c9c-47", "ovs_interfaceid": "dcdc8c9c-47cb-4024-819d-2d59c3c04b4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": 
{}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1962.049357] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:2d:10:21', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'fa01fe1a-83b6-4c10-af75-00ddb17f9bbf', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'dcdc8c9c-47cb-4024-819d-2d59c3c04b4e', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1962.056774] env[67893]: DEBUG oslo.service.loopingcall [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1962.057997] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1962.059031] env[67893]: DEBUG nova.compute.manager [req-b23d6637-f64c-46f6-a60c-9d22a29609fb req-80a6e054-d6f5-4ef6-a96c-1a283cf1b154 service nova] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Received event network-vif-plugged-dcdc8c9c-47cb-4024-819d-2d59c3c04b4e {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1962.059621] env[67893]: DEBUG oslo_concurrency.lockutils [req-b23d6637-f64c-46f6-a60c-9d22a29609fb req-80a6e054-d6f5-4ef6-a96c-1a283cf1b154 service nova] Acquiring lock "72410dc2-74d9-4d59-bdd1-ad45b01c482b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1962.059837] env[67893]: DEBUG oslo_concurrency.lockutils [req-b23d6637-f64c-46f6-a60c-9d22a29609fb req-80a6e054-d6f5-4ef6-a96c-1a283cf1b154 service nova] Lock "72410dc2-74d9-4d59-bdd1-ad45b01c482b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1962.060015] env[67893]: DEBUG oslo_concurrency.lockutils [req-b23d6637-f64c-46f6-a60c-9d22a29609fb req-80a6e054-d6f5-4ef6-a96c-1a283cf1b154 service nova] Lock "72410dc2-74d9-4d59-bdd1-ad45b01c482b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1962.060194] env[67893]: DEBUG nova.compute.manager [req-b23d6637-f64c-46f6-a60c-9d22a29609fb req-80a6e054-d6f5-4ef6-a96c-1a283cf1b154 service nova] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] No waiting events found dispatching network-vif-plugged-dcdc8c9c-47cb-4024-819d-2d59c3c04b4e {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1962.060361] env[67893]: WARNING nova.compute.manager [req-b23d6637-f64c-46f6-a60c-9d22a29609fb req-80a6e054-d6f5-4ef6-a96c-1a283cf1b154 service nova] [instance: 
72410dc2-74d9-4d59-bdd1-ad45b01c482b] Received unexpected event network-vif-plugged-dcdc8c9c-47cb-4024-819d-2d59c3c04b4e for instance with vm_state building and task_state spawning. [ 1962.060521] env[67893]: DEBUG nova.compute.manager [req-b23d6637-f64c-46f6-a60c-9d22a29609fb req-80a6e054-d6f5-4ef6-a96c-1a283cf1b154 service nova] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Received event network-changed-dcdc8c9c-47cb-4024-819d-2d59c3c04b4e {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1962.060675] env[67893]: DEBUG nova.compute.manager [req-b23d6637-f64c-46f6-a60c-9d22a29609fb req-80a6e054-d6f5-4ef6-a96c-1a283cf1b154 service nova] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Refreshing instance network info cache due to event network-changed-dcdc8c9c-47cb-4024-819d-2d59c3c04b4e. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1962.060859] env[67893]: DEBUG oslo_concurrency.lockutils [req-b23d6637-f64c-46f6-a60c-9d22a29609fb req-80a6e054-d6f5-4ef6-a96c-1a283cf1b154 service nova] Acquiring lock "refresh_cache-72410dc2-74d9-4d59-bdd1-ad45b01c482b" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1962.060998] env[67893]: DEBUG oslo_concurrency.lockutils [req-b23d6637-f64c-46f6-a60c-9d22a29609fb req-80a6e054-d6f5-4ef6-a96c-1a283cf1b154 service nova] Acquired lock "refresh_cache-72410dc2-74d9-4d59-bdd1-ad45b01c482b" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1962.061183] env[67893]: DEBUG nova.network.neutron [req-b23d6637-f64c-46f6-a60c-9d22a29609fb req-80a6e054-d6f5-4ef6-a96c-1a283cf1b154 service nova] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Refreshing network info cache for port dcdc8c9c-47cb-4024-819d-2d59c3c04b4e {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1962.062100] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-480cf5cb-5c77-43aa-b856-9e9189a67077 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1962.084723] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1962.084723] env[67893]: value = "task-3455489" [ 1962.084723] env[67893]: _type = "Task" [ 1962.084723] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1962.093075] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455489, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1962.399562] env[67893]: DEBUG nova.network.neutron [req-b23d6637-f64c-46f6-a60c-9d22a29609fb req-80a6e054-d6f5-4ef6-a96c-1a283cf1b154 service nova] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Updated VIF entry in instance network info cache for port dcdc8c9c-47cb-4024-819d-2d59c3c04b4e. 
{{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1962.399927] env[67893]: DEBUG nova.network.neutron [req-b23d6637-f64c-46f6-a60c-9d22a29609fb req-80a6e054-d6f5-4ef6-a96c-1a283cf1b154 service nova] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Updating instance_info_cache with network_info: [{"id": "dcdc8c9c-47cb-4024-819d-2d59c3c04b4e", "address": "fa:16:3e:2d:10:21", "network": {"id": "3269c624-7a70-494c-85bc-8230ffbbab83", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-740576182-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b439a6039a714a6fabd3c0477629d3c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdcdc8c9c-47", "ovs_interfaceid": "dcdc8c9c-47cb-4024-819d-2d59c3c04b4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1962.412022] env[67893]: DEBUG oslo_concurrency.lockutils [req-b23d6637-f64c-46f6-a60c-9d22a29609fb req-80a6e054-d6f5-4ef6-a96c-1a283cf1b154 service nova] Releasing lock "refresh_cache-72410dc2-74d9-4d59-bdd1-ad45b01c482b" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1962.595081] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455489, 'name': CreateVM_Task, 'duration_secs': 0.313824} completed successfully. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1962.595253] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1962.595903] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1962.596072] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1962.596421] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1962.596663] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1dda34f5-47b6-456a-85d9-6e01941162a6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1962.600826] env[67893]: DEBUG oslo_vmware.api [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Waiting for the task: (returnval){ [ 1962.600826] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]526ac69e-75ce-26e2-d5c0-47e4e76a30c7" [ 1962.600826] env[67893]: _type = "Task" [ 1962.600826] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1962.610397] env[67893]: DEBUG oslo_vmware.api [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]526ac69e-75ce-26e2-d5c0-47e4e76a30c7, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1963.112054] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1963.112506] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1963.112506] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1971.140727] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dee9e879-9e3e-4c1a-818e-f071a6166f9d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "7169c720-f69e-40a3-95d2-473639884cd9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1971.258197] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1974.854839] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1975.859064] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1975.859064] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1975.859064] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1975.881066] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Skipping network cache update for instance because it is Building.
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.881066] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.881066] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.881066] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.881066] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.881066] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.881399] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.881399] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 11000d92-0094-4561-a807-ca76610ea549] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.881533] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.881670] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.881808] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1975.882340] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1976.859210] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1976.859536] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1976.859628] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1977.860846] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1978.859230] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1981.860337] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1981.872273] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1981.872692] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1981.873084] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1981.873416] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1981.874642] env[67893]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b885d29b-0f7a-44ad-99ae-f56e3104e0f8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1981.883469] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfa2758c-adfa-4302-a47d-49e364e3e24a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1981.897341] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de2f29eb-3378-4f71-aec8-874f95b2fd32 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1981.903307] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f167424-b172-4b1f-ad3e-644be59fb07b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1981.934028] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180986MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1981.934028] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1981.934212] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1982.003588] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance e1849daf-3781-42ef-bede-267efbb652c9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1982.003762] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2875b0a3-0213-4908-b86b-ce45a8901553 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1982.003889] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance dfb92d1c-c2a5-49c1-8526-3743cb385c97 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1982.004027] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance ad60df35-54c0-459e-8a25-981922ae0a88 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1982.004158] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 9fc9f6b0-928e-46b4-ad7c-9217b2f31575 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1982.004276] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 15893b5f-a02a-4ce7-80c9-eea0658f9ac7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1982.004393] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 94760898-4f3c-4f41-85be-366f4108d0ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1982.004512] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 11000d92-0094-4561-a807-ca76610ea549 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1982.004628] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 7169c720-f69e-40a3-95d2-473639884cd9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1982.004740] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 72410dc2-74d9-4d59-bdd1-ad45b01c482b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1982.015331] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a5151a22-4174-4f66-a83a-55a0dd01c407 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1982.015560] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1982.015707] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1982.133907] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1735f28e-6621-4fd3-b296-f86ec83b61ba {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1982.141006] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e8ab506-1119-45f9-8974-a8bf34e70f32 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1982.169560] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-412ea20e-b07e-4a74-8eba-f8578292939f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1982.175916] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9958b837-f4f2-4cd6-b8bf-803940ff0038 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1982.188223] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1982.196255] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1982.209348] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1982.209530] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.275s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2008.164998] env[67893]: WARNING oslo_vmware.rw_handles [None 
req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2008.164998] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2008.164998] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2008.164998] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2008.164998] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2008.164998] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 2008.164998] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2008.164998] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2008.164998] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2008.164998] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2008.164998] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2008.164998] env[67893]: ERROR oslo_vmware.rw_handles [ 2008.165729] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/4344c1af-1de7-4a9c-aa0c-98a3cf4b4469/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2008.167777] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2008.168055] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Copying Virtual Disk [datastore1] vmware_temp/4344c1af-1de7-4a9c-aa0c-98a3cf4b4469/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/4344c1af-1de7-4a9c-aa0c-98a3cf4b4469/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2008.168370] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-2b222074-dc0d-470e-a22d-87ad9fc678d4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2008.176223] env[67893]: DEBUG oslo_vmware.api [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Waiting for the task: (returnval){ [ 2008.176223] env[67893]: value = "task-3455490" [ 2008.176223] env[67893]: _type = "Task" [ 2008.176223] 
env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2008.183903] env[67893]: DEBUG oslo_vmware.api [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Task: {'id': task-3455490, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2008.687054] env[67893]: DEBUG oslo_vmware.exceptions [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2008.687353] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2008.687920] env[67893]: ERROR nova.compute.manager [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2008.687920] env[67893]: Faults: ['InvalidArgument'] [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] Traceback (most recent call last): [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] yield resources [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] self.driver.spawn(context, instance, image_meta, [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] self._fetch_image_if_missing(context, vi) [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] image_cache(vi, tmp_image_ds_loc) [ 2008.687920] env[67893]: ERROR 
nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] vm_util.copy_virtual_disk( [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] session._wait_for_task(vmdk_copy_task) [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] return self.wait_for_task(task_ref) [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] return evt.wait() [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] result = hub.switch() [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] return self.greenlet.switch() [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] self.f(*self.args, **self.kw) [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] raise exceptions.translate_fault(task_info.error) [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] Faults: ['InvalidArgument'] [ 2008.687920] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] [ 2008.689097] env[67893]: INFO nova.compute.manager [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Terminating instance [ 2008.690679] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquired lock 
"[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2008.690889] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2008.691577] env[67893]: DEBUG nova.compute.manager [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2008.691777] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2008.692016] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fed2456a-9014-4ed1-aeec-c5e305fe66b2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2008.694231] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-992e020f-4039-401e-996d-8a8f24421f8a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2008.700723] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2008.700930] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-15c85e1f-b6e4-4c8c-bc07-9ace631a8819 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2008.703395] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2008.703566] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2008.704220] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0c3d92f9-5ee4-48ba-9956-5022218999b8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2008.708646] env[67893]: DEBUG oslo_vmware.api [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Waiting for the task: (returnval){ [ 2008.708646] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]529d20f0-a125-eda0-698a-1a518c906186" [ 2008.708646] env[67893]: _type = "Task" [ 2008.708646] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2008.715774] env[67893]: DEBUG oslo_vmware.api [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]529d20f0-a125-eda0-698a-1a518c906186, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2009.218831] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2009.220011] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Creating directory with path [datastore1] vmware_temp/e1f6d7b8-1892-418d-8d0f-801fae02db6f/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2009.220011] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7e70839d-59e9-4089-b082-1f5138c34d8a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2009.239223] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Created directory with path [datastore1] vmware_temp/e1f6d7b8-1892-418d-8d0f-801fae02db6f/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2009.239436] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Fetch image to [datastore1] vmware_temp/e1f6d7b8-1892-418d-8d0f-801fae02db6f/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2009.239627] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 
to [datastore1] vmware_temp/e1f6d7b8-1892-418d-8d0f-801fae02db6f/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2009.240365] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ecce872-02d6-4112-9433-9d074d95dfb1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2009.247138] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b1908c8-01f1-4f83-8ff3-1862d243cd28 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2009.255884] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b8a6b63-a149-4df5-ac4a-8ce8b62c70f3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2009.287547] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92da71e1-c1a5-485b-afd8-b651c84e3161 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2009.292784] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4e7df946-113b-4644-8545-36c44a44b96b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2009.382394] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2009.432659] env[67893]: DEBUG oslo_vmware.rw_handles [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e1f6d7b8-1892-418d-8d0f-801fae02db6f/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2009.492780] env[67893]: DEBUG oslo_vmware.rw_handles [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2009.492974] env[67893]: DEBUG oslo_vmware.rw_handles [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e1f6d7b8-1892-418d-8d0f-801fae02db6f/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2009.763181] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2009.763368] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2009.763526] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Deleting the datastore file [datastore1] e1849daf-3781-42ef-bede-267efbb652c9 {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2009.763824] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f2e34287-e79d-4f2e-9bf7-469ecac3db7c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2009.769734] env[67893]: DEBUG oslo_vmware.api [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Waiting for the task: (returnval){ [ 2009.769734] env[67893]: value = "task-3455492" [ 2009.769734] env[67893]: _type = "Task" [ 2009.769734] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2009.777087] env[67893]: DEBUG oslo_vmware.api [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Task: {'id': task-3455492, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2010.279344] env[67893]: DEBUG oslo_vmware.api [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Task: {'id': task-3455492, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.06635} completed successfully. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2010.279648] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2010.279775] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2010.279946] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2010.280137] env[67893]: INFO nova.compute.manager [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Took 1.59 seconds to destroy the instance on the hypervisor. [ 2010.283063] env[67893]: DEBUG nova.compute.claims [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2010.283243] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2010.283457] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2010.445738] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a8822c3-c1a0-405e-b02a-4057d5abb9b0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2010.453279] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c32eb05-bff9-42e9-b128-c2c9b842918e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2010.482900] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e196f3f9-6618-4fd0-9878-4c43b4dff11f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2010.489055] env[67893]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9f12909-daaf-4aaf-b5a4-537afde546c8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2010.504256] env[67893]: DEBUG nova.compute.provider_tree [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2010.512810] env[67893]: DEBUG nova.scheduler.client.report [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2010.527263] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.244s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2010.527783] env[67893]: ERROR nova.compute.manager [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2010.527783] env[67893]: Faults: ['InvalidArgument'] [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] Traceback (most recent call last): [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] self.driver.spawn(context, instance, image_meta, [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] self._fetch_image_if_missing(context, vi) [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in 
_fetch_image_if_missing [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] image_cache(vi, tmp_image_ds_loc) [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] vm_util.copy_virtual_disk( [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] session._wait_for_task(vmdk_copy_task) [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] return self.wait_for_task(task_ref) [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] return evt.wait() [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] result = hub.switch() [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] return self.greenlet.switch() [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] self.f(*self.args, **self.kw) [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] raise exceptions.translate_fault(task_info.error) [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] Faults: ['InvalidArgument'] [ 2010.527783] env[67893]: ERROR nova.compute.manager [instance: e1849daf-3781-42ef-bede-267efbb652c9] [ 2010.528636] env[67893]: DEBUG nova.compute.utils [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] VimFaultException {{(pid=67893) 
notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2010.529810] env[67893]: DEBUG nova.compute.manager [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Build of instance e1849daf-3781-42ef-bede-267efbb652c9 was re-scheduled: A specified parameter was not correct: fileType [ 2010.529810] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2010.530198] env[67893]: DEBUG nova.compute.manager [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2010.530365] env[67893]: DEBUG nova.compute.manager [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2010.530539] env[67893]: DEBUG nova.compute.manager [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2010.530699] env[67893]: DEBUG nova.network.neutron [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2010.793993] env[67893]: DEBUG nova.network.neutron [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2010.806144] env[67893]: INFO nova.compute.manager [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Took 0.28 seconds to deallocate network for instance. 
[ 2010.907899] env[67893]: INFO nova.scheduler.client.report [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Deleted allocations for instance e1849daf-3781-42ef-bede-267efbb652c9 [ 2010.928467] env[67893]: DEBUG oslo_concurrency.lockutils [None req-bcda8e4b-dc17-433b-b2f7-f8d7b0f40a6b tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "e1849daf-3781-42ef-bede-267efbb652c9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 628.989s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2010.929734] env[67893]: DEBUG oslo_concurrency.lockutils [None req-269d84f0-8244-42b8-8300-227cbcea2b2a tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "e1849daf-3781-42ef-bede-267efbb652c9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 432.283s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2010.929963] env[67893]: DEBUG oslo_concurrency.lockutils [None req-269d84f0-8244-42b8-8300-227cbcea2b2a tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "e1849daf-3781-42ef-bede-267efbb652c9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2010.930184] env[67893]: DEBUG oslo_concurrency.lockutils [None req-269d84f0-8244-42b8-8300-227cbcea2b2a tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "e1849daf-3781-42ef-bede-267efbb652c9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2010.930350] env[67893]: DEBUG oslo_concurrency.lockutils [None req-269d84f0-8244-42b8-8300-227cbcea2b2a tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "e1849daf-3781-42ef-bede-267efbb652c9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2010.932404] env[67893]: INFO nova.compute.manager [None req-269d84f0-8244-42b8-8300-227cbcea2b2a tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Terminating instance [ 2010.934129] env[67893]: DEBUG nova.compute.manager [None req-269d84f0-8244-42b8-8300-227cbcea2b2a tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Start destroying the instance on the hypervisor.
{{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2010.934328] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-269d84f0-8244-42b8-8300-227cbcea2b2a tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2010.934794] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-cd537c5d-720f-4649-9f11-56ab469b9b1b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2010.944664] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-933c91ac-f5a3-4cdb-a344-7632801614af {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2010.955153] env[67893]: DEBUG nova.compute.manager [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2010.975218] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-269d84f0-8244-42b8-8300-227cbcea2b2a tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e1849daf-3781-42ef-bede-267efbb652c9 could not be found. [ 2010.975426] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-269d84f0-8244-42b8-8300-227cbcea2b2a tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2010.975649] env[67893]: INFO nova.compute.manager [None req-269d84f0-8244-42b8-8300-227cbcea2b2a tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2010.975966] env[67893]: DEBUG oslo.service.loopingcall [None req-269d84f0-8244-42b8-8300-227cbcea2b2a tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2010.976211] env[67893]: DEBUG nova.compute.manager [-] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2010.976306] env[67893]: DEBUG nova.network.neutron [-] [instance: e1849daf-3781-42ef-bede-267efbb652c9] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2010.998555] env[67893]: DEBUG nova.network.neutron [-] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2011.002350] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2011.002614] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2011.003956] env[67893]: INFO nova.compute.claims [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2011.007872] env[67893]: INFO nova.compute.manager [-] [instance: e1849daf-3781-42ef-bede-267efbb652c9] Took 0.03 seconds to deallocate network for instance. [ 2011.097356] env[67893]: DEBUG oslo_concurrency.lockutils [None req-269d84f0-8244-42b8-8300-227cbcea2b2a tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "e1849daf-3781-42ef-bede-267efbb652c9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.168s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2011.098235] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "e1849daf-3781-42ef-bede-267efbb652c9" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 206.727s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2011.098393] env[67893]: INFO nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: e1849daf-3781-42ef-bede-267efbb652c9] During sync_power_state the instance has a pending task (deleting). Skip.
[ 2011.098565] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "e1849daf-3781-42ef-bede-267efbb652c9" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2011.174370] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af357f52-aa8d-4857-8053-d7abf67af18e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2011.182841] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8445e2f-cadb-470c-99a3-db6539aa5508 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2011.216262] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bee318ff-fef9-4617-85b3-53db06e8b985 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2011.223750] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b047141d-e96d-4ef1-9960-007b848d643d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2011.237559] env[67893]: DEBUG nova.compute.provider_tree [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2011.246509] env[67893]: DEBUG nova.scheduler.client.report [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2011.259850] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.257s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2011.260378] env[67893]: DEBUG nova.compute.manager [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Start building networks asynchronously for instance.
{{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2011.292598] env[67893]: DEBUG nova.compute.utils [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2011.294696] env[67893]: DEBUG nova.compute.manager [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2011.294904] env[67893]: DEBUG nova.network.neutron [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2011.304937] env[67893]: DEBUG nova.compute.manager [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2011.358814] env[67893]: DEBUG nova.policy [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '71de4983c0904523a76f06159acb1780', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f40d421edc4b42518a78163ad38309f0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 2011.370371] env[67893]: DEBUG nova.compute.manager [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2011.394914] env[67893]: DEBUG nova.virt.hardware [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2011.395186] env[67893]: DEBUG nova.virt.hardware [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2011.395374] env[67893]: DEBUG nova.virt.hardware [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2011.395591] env[67893]: DEBUG nova.virt.hardware [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2011.395747] env[67893]: DEBUG nova.virt.hardware [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2011.395897] env[67893]: DEBUG nova.virt.hardware [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2011.396125] env[67893]: DEBUG nova.virt.hardware [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2011.396294] env[67893]: DEBUG nova.virt.hardware [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2011.396462] env[67893]: DEBUG 
nova.virt.hardware [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2011.396624] env[67893]: DEBUG nova.virt.hardware [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2011.396798] env[67893]: DEBUG nova.virt.hardware [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2011.397659] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d809549-8c63-4936-a16e-402829a2d9ea {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2011.406193] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5aabcd23-848a-4d2d-ac7e-5210131dded0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2011.726350] env[67893]: DEBUG nova.network.neutron [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Successfully created port: 18ecd52c-e8c5-4350-8c02-4180239a84f0 {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2012.276709] env[67893]: DEBUG nova.network.neutron [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Successfully updated port: 18ecd52c-e8c5-4350-8c02-4180239a84f0 {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2012.290251] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Acquiring lock "refresh_cache-a5151a22-4174-4f66-a83a-55a0dd01c407" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2012.290402] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Acquired lock "refresh_cache-a5151a22-4174-4f66-a83a-55a0dd01c407" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2012.290580] env[67893]: DEBUG nova.network.neutron [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2012.339373] env[67893]: DEBUG nova.network.neutron [None 
req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2012.750316] env[67893]: DEBUG nova.network.neutron [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Updating instance_info_cache with network_info: [{"id": "18ecd52c-e8c5-4350-8c02-4180239a84f0", "address": "fa:16:3e:11:f6:97", "network": {"id": "93089f42-d881-4116-b07a-dbf0f442244d", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1610683550-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f40d421edc4b42518a78163ad38309f0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b5215e5b-294b-4e8c-bd06-355e9955ab1d", "external-id": "nsx-vlan-transportzone-529", "segmentation_id": 529, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap18ecd52c-e8", "ovs_interfaceid": "18ecd52c-e8c5-4350-8c02-4180239a84f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2012.766230] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Releasing lock "refresh_cache-a5151a22-4174-4f66-a83a-55a0dd01c407" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2012.766538] env[67893]: DEBUG nova.compute.manager [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Instance network_info: |[{"id": "18ecd52c-e8c5-4350-8c02-4180239a84f0", "address": "fa:16:3e:11:f6:97", "network": {"id": "93089f42-d881-4116-b07a-dbf0f442244d", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1610683550-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f40d421edc4b42518a78163ad38309f0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b5215e5b-294b-4e8c-bd06-355e9955ab1d", "external-id": "nsx-vlan-transportzone-529", "segmentation_id": 529, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap18ecd52c-e8", "ovs_interfaceid": 
"18ecd52c-e8c5-4350-8c02-4180239a84f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2012.766938] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:11:f6:97', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'b5215e5b-294b-4e8c-bd06-355e9955ab1d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '18ecd52c-e8c5-4350-8c02-4180239a84f0', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2012.774424] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Creating folder: Project (f40d421edc4b42518a78163ad38309f0). Parent ref: group-v689771. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2012.775009] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9da1c66b-8806-458f-be20-ffef654bec89 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2012.786204] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Created folder: Project (f40d421edc4b42518a78163ad38309f0) in parent group-v689771. [ 2012.786385] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Creating folder: Instances. Parent ref: group-v689878. {{(pid=67893) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2012.786610] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f6432397-ea27-4c93-8bf7-4a30dd77a008 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2012.795212] env[67893]: INFO nova.virt.vmwareapi.vm_util [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Created folder: Instances in parent group-v689878. [ 2012.795436] env[67893]: DEBUG oslo.service.loopingcall [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2012.795662] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2012.795896] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c32ae643-4601-4033-b6da-cb116d5468d7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2012.814434] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2012.814434] env[67893]: value = "task-3455495" [ 2012.814434] env[67893]: _type = "Task" [ 2012.814434] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2012.821716] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455495, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2012.893290] env[67893]: DEBUG nova.compute.manager [req-bbfe0efb-83f0-416c-811c-582d22740811 req-c6211e16-6faa-4ca2-930b-5c9bbb36c3a0 service nova] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Received event network-vif-plugged-18ecd52c-e8c5-4350-8c02-4180239a84f0 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2012.893540] env[67893]: DEBUG oslo_concurrency.lockutils [req-bbfe0efb-83f0-416c-811c-582d22740811 req-c6211e16-6faa-4ca2-930b-5c9bbb36c3a0 service nova] Acquiring lock "a5151a22-4174-4f66-a83a-55a0dd01c407-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2012.893730] env[67893]: DEBUG oslo_concurrency.lockutils [req-bbfe0efb-83f0-416c-811c-582d22740811 req-c6211e16-6faa-4ca2-930b-5c9bbb36c3a0 service nova] Lock "a5151a22-4174-4f66-a83a-55a0dd01c407-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2012.893865] env[67893]: DEBUG oslo_concurrency.lockutils [req-bbfe0efb-83f0-416c-811c-582d22740811 req-c6211e16-6faa-4ca2-930b-5c9bbb36c3a0 service nova] Lock "a5151a22-4174-4f66-a83a-55a0dd01c407-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2012.894050] env[67893]: DEBUG nova.compute.manager [req-bbfe0efb-83f0-416c-811c-582d22740811 req-c6211e16-6faa-4ca2-930b-5c9bbb36c3a0 service nova] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] No waiting events found dispatching network-vif-plugged-18ecd52c-e8c5-4350-8c02-4180239a84f0 {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2012.894226] env[67893]: WARNING nova.compute.manager [req-bbfe0efb-83f0-416c-811c-582d22740811 req-c6211e16-6faa-4ca2-930b-5c9bbb36c3a0 service nova] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Received unexpected event network-vif-plugged-18ecd52c-e8c5-4350-8c02-4180239a84f0 for instance with vm_state building and task_state spawning.
[ 2012.894386] env[67893]: DEBUG nova.compute.manager [req-bbfe0efb-83f0-416c-811c-582d22740811 req-c6211e16-6faa-4ca2-930b-5c9bbb36c3a0 service nova] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Received event network-changed-18ecd52c-e8c5-4350-8c02-4180239a84f0 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2012.894595] env[67893]: DEBUG nova.compute.manager [req-bbfe0efb-83f0-416c-811c-582d22740811 req-c6211e16-6faa-4ca2-930b-5c9bbb36c3a0 service nova] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Refreshing instance network info cache due to event network-changed-18ecd52c-e8c5-4350-8c02-4180239a84f0. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2012.894811] env[67893]: DEBUG oslo_concurrency.lockutils [req-bbfe0efb-83f0-416c-811c-582d22740811 req-c6211e16-6faa-4ca2-930b-5c9bbb36c3a0 service nova] Acquiring lock "refresh_cache-a5151a22-4174-4f66-a83a-55a0dd01c407" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2012.894949] env[67893]: DEBUG oslo_concurrency.lockutils [req-bbfe0efb-83f0-416c-811c-582d22740811 req-c6211e16-6faa-4ca2-930b-5c9bbb36c3a0 service nova] Acquired lock "refresh_cache-a5151a22-4174-4f66-a83a-55a0dd01c407" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2012.895140] env[67893]: DEBUG nova.network.neutron [req-bbfe0efb-83f0-416c-811c-582d22740811 req-c6211e16-6faa-4ca2-930b-5c9bbb36c3a0 service nova] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Refreshing network info cache for port 18ecd52c-e8c5-4350-8c02-4180239a84f0 {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2013.143827] env[67893]: DEBUG nova.network.neutron [req-bbfe0efb-83f0-416c-811c-582d22740811 req-c6211e16-6faa-4ca2-930b-5c9bbb36c3a0 service nova] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Updated VIF entry in instance network info cache for port 18ecd52c-e8c5-4350-8c02-4180239a84f0. 
{{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2013.144264] env[67893]: DEBUG nova.network.neutron [req-bbfe0efb-83f0-416c-811c-582d22740811 req-c6211e16-6faa-4ca2-930b-5c9bbb36c3a0 service nova] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Updating instance_info_cache with network_info: [{"id": "18ecd52c-e8c5-4350-8c02-4180239a84f0", "address": "fa:16:3e:11:f6:97", "network": {"id": "93089f42-d881-4116-b07a-dbf0f442244d", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1610683550-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f40d421edc4b42518a78163ad38309f0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b5215e5b-294b-4e8c-bd06-355e9955ab1d", "external-id": "nsx-vlan-transportzone-529", "segmentation_id": 529, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap18ecd52c-e8", "ovs_interfaceid": "18ecd52c-e8c5-4350-8c02-4180239a84f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2013.154274] env[67893]: DEBUG oslo_concurrency.lockutils [req-bbfe0efb-83f0-416c-811c-582d22740811 req-c6211e16-6faa-4ca2-930b-5c9bbb36c3a0 service nova] Releasing lock "refresh_cache-a5151a22-4174-4f66-a83a-55a0dd01c407" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2013.324057] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455495, 'name': CreateVM_Task, 'duration_secs': 0.316864} completed successfully. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2013.324262] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2013.324933] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2013.325116] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2013.325442] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2013.325696] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-87758610-9ced-44e0-8234-8798a7eb7e73 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2013.330122] env[67893]: DEBUG oslo_vmware.api [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Waiting for the task: (returnval){ [ 2013.330122] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52d59351-dc12-3836-cbc6-cf5aa6d55bc4" [ 2013.330122] env[67893]: _type = "Task" [ 2013.330122] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2013.337610] env[67893]: DEBUG oslo_vmware.api [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52d59351-dc12-3836-cbc6-cf5aa6d55bc4, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2013.840220] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2013.840628] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2013.840707] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2027.332889] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dfdbc632-6bd6-4c5e-af06-f77846119324 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "72410dc2-74d9-4d59-bdd1-ad45b01c482b" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2033.209889] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2035.853760] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2035.858379] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2036.858562] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2036.858851] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2036.858890] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2036.881513] env[67893]: DEBUG nova.compute.manager [None 
req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.881740] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.881933] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.882130] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.882308] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.882477] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.882647] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 11000d92-0094-4561-a807-ca76610ea549] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.882817] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.882988] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.883173] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.883344] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2036.883996] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2036.884272] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2037.859874] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2039.854967] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2039.876772] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2039.876970] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2043.859944] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2043.871012] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2043.871242] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2043.871411] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2043.871569] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2043.872827] env[67893]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb0d3a4a-21d2-4469-8963-9d92943d4f84 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2043.881807] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b351a8d5-b786-45a7-848f-d89d6ba55a0b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2043.895578] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-542d67c7-4621-42ec-a513-46ce8d9ebe5b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2043.901681] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2d25d45-58fe-4318-8fea-0ce390467913 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2043.929572] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180985MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2043.929750] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2043.929950] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2044.001514] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 2875b0a3-0213-4908-b86b-ce45a8901553 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.001684] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance dfb92d1c-c2a5-49c1-8526-3743cb385c97 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.001838] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance ad60df35-54c0-459e-8a25-981922ae0a88 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.001970] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 9fc9f6b0-928e-46b4-ad7c-9217b2f31575 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.002109] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 15893b5f-a02a-4ce7-80c9-eea0658f9ac7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.002230] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 94760898-4f3c-4f41-85be-366f4108d0ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.002345] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 11000d92-0094-4561-a807-ca76610ea549 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.002461] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 7169c720-f69e-40a3-95d2-473639884cd9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.002574] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 72410dc2-74d9-4d59-bdd1-ad45b01c482b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.002685] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a5151a22-4174-4f66-a83a-55a0dd01c407 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.002898] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2044.003062] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2044.116756] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2507caa3-9059-43a1-9509-32a7bbbec040 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2044.124262] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89cf1d09-67fb-4b15-85f4-86b6b6ac580f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2044.152796] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e11d033-a20c-4a40-9574-323fba64c546 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2044.160079] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffc2cd80-5aab-4b57-b601-620a6cef0751 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2044.171879] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2044.180846] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2044.196777] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2044.197068] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.267s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2058.620521] env[67893]: WARNING oslo_vmware.rw_handles [None 
req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2058.620521] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2058.620521] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2058.620521] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2058.620521] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2058.620521] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 2058.620521] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2058.620521] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2058.620521] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2058.620521] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2058.620521] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2058.620521] env[67893]: ERROR oslo_vmware.rw_handles [ 2058.621208] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/e1f6d7b8-1892-418d-8d0f-801fae02db6f/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2058.624216] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2058.624456] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Copying Virtual Disk [datastore1] vmware_temp/e1f6d7b8-1892-418d-8d0f-801fae02db6f/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/e1f6d7b8-1892-418d-8d0f-801fae02db6f/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2058.624755] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9c592d5f-06a3-4106-99f2-b3e61c3eee8a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2058.632463] env[67893]: DEBUG oslo_vmware.api [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Waiting for the task: (returnval){ [ 2058.632463] env[67893]: value = "task-3455496" [ 2058.632463] env[67893]: _type = "Task" [ 2058.632463] env[67893]: } to complete. 
{{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2058.640380] env[67893]: DEBUG oslo_vmware.api [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Task: {'id': task-3455496, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2059.143143] env[67893]: DEBUG oslo_vmware.exceptions [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2059.143447] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2059.144057] env[67893]: ERROR nova.compute.manager [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2059.144057] env[67893]: Faults: ['InvalidArgument'] [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Traceback (most recent call last): [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] yield resources [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] self.driver.spawn(context, instance, image_meta, [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] self._fetch_image_if_missing(context, vi) [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] image_cache(vi, tmp_image_ds_loc) [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] 
File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] vm_util.copy_virtual_disk( [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] session._wait_for_task(vmdk_copy_task) [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] return self.wait_for_task(task_ref) [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] return evt.wait() [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] result = hub.switch() [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] return self.greenlet.switch() [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] self.f(*self.args, **self.kw) [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] raise exceptions.translate_fault(task_info.error) [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Faults: ['InvalidArgument'] [ 2059.144057] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] [ 2059.145303] env[67893]: INFO nova.compute.manager [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Terminating instance [ 2059.145966] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2059.146190] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2059.146434] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-49cf7fc1-6715-4856-9714-88e1cb34b7f0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2059.149860] env[67893]: DEBUG nova.compute.manager [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2059.150081] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2059.150791] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3a99d33-475c-469e-beed-7769fb486f4b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2059.157240] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2059.157440] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-12f5c33d-803b-415b-b279-a76e6b2e3802 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2059.159502] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2059.159674] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2059.160624] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-102b5c76-8938-416d-915b-fffadeed5ec3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2059.165138] env[67893]: DEBUG oslo_vmware.api [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Waiting for the task: (returnval){ [ 2059.165138] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52c0dcb6-5445-92eb-2c14-b79a2cf15f6b" [ 2059.165138] env[67893]: _type = "Task" [ 2059.165138] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2059.171679] env[67893]: DEBUG oslo_vmware.api [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52c0dcb6-5445-92eb-2c14-b79a2cf15f6b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2059.220750] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2059.220918] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2059.221279] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Deleting the datastore file [datastore1] 2875b0a3-0213-4908-b86b-ce45a8901553 {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2059.221371] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-330998cc-c414-4852-ae95-fc1de6cdb336 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2059.227080] env[67893]: DEBUG oslo_vmware.api [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Waiting for the task: (returnval){ [ 2059.227080] env[67893]: value = "task-3455498" [ 2059.227080] env[67893]: _type = "Task" [ 2059.227080] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2059.234802] env[67893]: DEBUG oslo_vmware.api [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Task: {'id': task-3455498, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2059.675762] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2059.676187] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Creating directory with path [datastore1] vmware_temp/4b83264d-4dc7-493e-adf6-7d785bb48a11/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2059.676485] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e80b4934-6801-4f70-8e7c-066f366da7b3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2059.687513] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Created directory with path [datastore1] vmware_temp/4b83264d-4dc7-493e-adf6-7d785bb48a11/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2059.687764] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Fetch image to [datastore1] vmware_temp/4b83264d-4dc7-493e-adf6-7d785bb48a11/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2059.688024] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/4b83264d-4dc7-493e-adf6-7d785bb48a11/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2059.688803] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fffbd3eb-6c74-4ab0-baff-ade057549d1c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2059.695412] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d0b8459-91db-4d12-abcf-9b57f6345c32 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2059.704433] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cbf69d0-544c-49b3-ad74-3a1ff9521171 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2059.738041] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-8734fb2e-6243-470a-a752-2c42419996a5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2059.746190] env[67893]: DEBUG oslo_vmware.api [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Task: {'id': task-3455498, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07352} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2059.746751] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2059.746934] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2059.747122] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2059.747297] env[67893]: INFO nova.compute.manager [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Took 0.60 seconds to destroy the instance on the hypervisor. 
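The entries above trace the VMware driver's image-cache path: the sparse VMDK is downloaded into vmware_temp/..., VirtualDiskManager.CopyVirtualDisk_Task is invoked to convert it into the cached image, and the task is polled until it completes or raises a fault (here the InvalidArgument "fileType" fault that fails the spawn). A minimal sketch of that poll-until-done pattern, assuming an established oslo_vmware.api.VMwareAPISession named `session`; the API method names mirror the log, while the helper name and disk paths are illustrative only:

    from oslo_vmware import exceptions as vexc

    def copy_disk_and_wait(session, src_path, dst_path, dc_ref):
        # Illustrative helper (not Nova's own code). CopyVirtualDisk_Task
        # returns a Task managed object; wait_for_task() polls it, producing
        # the "progress is 0%" and "completed successfully" entries above.
        disk_mgr = session.vim.service_content.virtualDiskManager
        task = session.invoke_api(session.vim, 'CopyVirtualDisk_Task',
                                  disk_mgr,
                                  sourceName=src_path,
                                  sourceDatacenter=dc_ref,
                                  destName=dst_path,
                                  destDatacenter=dc_ref)
        try:
            return session.wait_for_task(task)
        except vexc.VimFaultException:
            # The InvalidArgument/fileType fault in the traceback above is
            # raised here and propagates up to _build_and_run_instance,
            # which then re-schedules the build.
            raise
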
[ 2059.748750] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0448c3f2-0aec-4839-91fc-3850ac083333 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2059.751185] env[67893]: DEBUG nova.compute.claims [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2059.751364] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2059.751586] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2059.773060] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2059.924959] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48937428-3516-4ef0-9560-6dd51e713a4b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2059.938297] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-adefd28e-8d3e-41e1-b227-6a3ef0c5f98d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2059.969039] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6914ed21-c81b-4d22-8395-8426fec5de2d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2059.975799] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ca8484e-a20f-4e07-a3a4-3d6a593dbf1a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2059.988391] env[67893]: DEBUG nova.compute.provider_tree [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2059.990545] env[67893]: DEBUG oslo_vmware.rw_handles [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 
tempest-ServerActionsTestOtherA-1715011108-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4b83264d-4dc7-493e-adf6-7d785bb48a11/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2060.048767] env[67893]: DEBUG nova.scheduler.client.report [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2060.052973] env[67893]: DEBUG oslo_vmware.rw_handles [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2060.052973] env[67893]: DEBUG oslo_vmware.rw_handles [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4b83264d-4dc7-493e-adf6-7d785bb48a11/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2060.064526] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.312s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2060.064526] env[67893]: ERROR nova.compute.manager [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2060.064526] env[67893]: Faults: ['InvalidArgument'] [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Traceback (most recent call last): [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] self.driver.spawn(context, instance, image_meta, [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] self._fetch_image_if_missing(context, vi) [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] image_cache(vi, tmp_image_ds_loc) [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] vm_util.copy_virtual_disk( [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] session._wait_for_task(vmdk_copy_task) [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] return self.wait_for_task(task_ref) [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] return evt.wait() [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] result = hub.switch() [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] return self.greenlet.switch() [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] self.f(*self.args, **self.kw) [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] raise exceptions.translate_fault(task_info.error) [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Faults: ['InvalidArgument'] [ 2060.064526] env[67893]: ERROR nova.compute.manager [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] [ 2060.065512] env[67893]: DEBUG nova.compute.utils [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2060.066754] env[67893]: DEBUG nova.compute.manager [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Build of instance 2875b0a3-0213-4908-b86b-ce45a8901553 was re-scheduled: A specified parameter was not correct: fileType [ 2060.066754] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2060.066925] env[67893]: DEBUG nova.compute.manager [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2060.067082] env[67893]: DEBUG nova.compute.manager [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2060.067264] env[67893]: DEBUG nova.compute.manager [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2060.067438] env[67893]: DEBUG nova.network.neutron [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2060.336756] env[67893]: DEBUG nova.network.neutron [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2060.347870] env[67893]: INFO nova.compute.manager [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Took 0.28 seconds to deallocate network for instance. [ 2060.439468] env[67893]: INFO nova.scheduler.client.report [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Deleted allocations for instance 2875b0a3-0213-4908-b86b-ce45a8901553 [ 2060.467712] env[67893]: DEBUG oslo_concurrency.lockutils [None req-c3a09bb2-6d2a-4600-b1a2-ab6a700ba229 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "2875b0a3-0213-4908-b86b-ce45a8901553" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 623.356s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2060.467994] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fd9152e3-d398-4d70-b0a6-7a3f372f9552 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "2875b0a3-0213-4908-b86b-ce45a8901553" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 426.837s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2060.468243] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fd9152e3-d398-4d70-b0a6-7a3f372f9552 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "2875b0a3-0213-4908-b86b-ce45a8901553-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2060.468451] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fd9152e3-d398-4d70-b0a6-7a3f372f9552 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "2875b0a3-0213-4908-b86b-ce45a8901553-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67893) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2060.468619] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fd9152e3-d398-4d70-b0a6-7a3f372f9552 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "2875b0a3-0213-4908-b86b-ce45a8901553-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2060.472610] env[67893]: INFO nova.compute.manager [None req-fd9152e3-d398-4d70-b0a6-7a3f372f9552 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Terminating instance [ 2060.474964] env[67893]: DEBUG nova.compute.manager [None req-fd9152e3-d398-4d70-b0a6-7a3f372f9552 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2060.475060] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-fd9152e3-d398-4d70-b0a6-7a3f372f9552 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2060.476037] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-72593671-70fa-4e7d-93ec-bddf6e40b01e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2060.484871] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2288f05-2b9e-4ee5-8229-7e49c27903ee {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2060.513769] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-fd9152e3-d398-4d70-b0a6-7a3f372f9552 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 2875b0a3-0213-4908-b86b-ce45a8901553 could not be found. [ 2060.513978] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-fd9152e3-d398-4d70-b0a6-7a3f372f9552 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2060.514175] env[67893]: INFO nova.compute.manager [None req-fd9152e3-d398-4d70-b0a6-7a3f372f9552 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2060.514413] env[67893]: DEBUG oslo.service.loopingcall [None req-fd9152e3-d398-4d70-b0a6-7a3f372f9552 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2060.514632] env[67893]: DEBUG nova.compute.manager [-] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2060.514728] env[67893]: DEBUG nova.network.neutron [-] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2060.538198] env[67893]: DEBUG nova.network.neutron [-] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2060.545911] env[67893]: INFO nova.compute.manager [-] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] Took 0.03 seconds to deallocate network for instance. [ 2060.627007] env[67893]: DEBUG oslo_concurrency.lockutils [None req-fd9152e3-d398-4d70-b0a6-7a3f372f9552 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "2875b0a3-0213-4908-b86b-ce45a8901553" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.159s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2060.627815] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "2875b0a3-0213-4908-b86b-ce45a8901553" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 256.256s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2060.628025] env[67893]: INFO nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 2875b0a3-0213-4908-b86b-ce45a8901553] During sync_power_state the instance has a pending task (deleting). Skip. 
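Note on the failure sequence recorded above: the build of instance 2875b0a3-0213-4908-b86b-ce45a8901553 failed while caching its image to the datastore. The vCenter task polled by oslo_vmware came back in the error state with VimFaultException "A specified parameter was not correct: fileType" (faults ['InvalidArgument']), so Nova re-scheduled the build and deallocated its network; the later terminate_instance then found no VM on the backend (InstanceNotFound), which is why the destroy completed in 0.04 seconds. Below is a minimal sketch of the oslo.vmware call pattern visible in this trace (invoke a vCenter task method, then block on wait_for_task(), which raises VimFaultException when the task errors). It is not the Nova code itself; the endpoint, credentials, and datastore paths are placeholders, and it assumes oslo.vmware's public VMwareAPISession/invoke_api/wait_for_task API.

    # Sketch only: reproduces the oslo.vmware task-polling pattern seen in the
    # traceback above. Host, credentials, and vmdk paths are placeholders.
    from oslo_vmware import api, exceptions

    session = api.VMwareAPISession(
        'vc.example.test', 'user', 'password',   # placeholder endpoint/creds
        api_retry_count=10, task_poll_interval=0.5)

    # The vCenter task method takes its managed object as the first argument,
    # here the VirtualDiskManager (mirrors the CopyVirtualDisk_Task invocation
    # logged elsewhere in this run).
    disk_mgr = session.vim.service_content.virtualDiskManager
    task = session.invoke_api(
        session.vim, 'CopyVirtualDisk_Task', disk_mgr,
        sourceName='[datastore1] vmware_temp/example/tmp-sparse.vmdk',
        destName='[datastore1] vmware_temp/example/image.vmdk')

    try:
        # wait_for_task() drives the same loopingcall/_poll_task loop that
        # appears in the traceback; when task_info.state is 'error' it calls
        # translate_fault() and raises the resulting exception.
        session.wait_for_task(task)
    except exceptions.VimFaultException as e:
        # e.fault_list carries the raw VIM fault names, e.g. ['InvalidArgument']
        print(e.fault_list, e)

When the exception propagates out of the driver's spawn path, _do_build_and_run_instance treats it as a retryable build failure, which produces the "Build of instance ... was re-scheduled" record above.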
[ 2060.628211] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "2875b0a3-0213-4908-b86b-ce45a8901553" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2066.346553] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ed988f03-4bec-438c-8228-f9c2afbcfed6 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Acquiring lock "a5151a22-4174-4f66-a83a-55a0dd01c407" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2078.760720] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquiring lock "a71c2ee1-0286-4098-afca-f7666469a95f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2078.760984] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock "a71c2ee1-0286-4098-afca-f7666469a95f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2078.772186] env[67893]: DEBUG nova.compute.manager [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Starting instance... 
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2078.823212] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2078.823491] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2078.824915] env[67893]: INFO nova.compute.claims [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2078.985346] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4298463a-ba6e-4495-94bf-fc791420f0bb {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2078.993177] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6be86042-8284-419c-a906-2ee163bf8f16 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2079.021868] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfc6219b-fcd7-4d03-912f-671ac347f68c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2079.028614] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a46e528a-7bc5-4ce4-b206-4cc20041a963 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2079.040998] env[67893]: DEBUG nova.compute.provider_tree [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2079.049170] env[67893]: DEBUG nova.scheduler.client.report [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2079.063354] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 
tempest-ServersTestJSON-1487459765-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.240s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2079.063809] env[67893]: DEBUG nova.compute.manager [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2079.095470] env[67893]: DEBUG nova.compute.utils [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2079.096556] env[67893]: DEBUG nova.compute.manager [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2079.096725] env[67893]: DEBUG nova.network.neutron [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2079.104259] env[67893]: DEBUG nova.compute.manager [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2079.157157] env[67893]: DEBUG nova.policy [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd1016091f2ab4fe69bcf52e8f536bc32', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '88e59a371b0d4dedb303e9b7f6d69b9d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 2079.163322] env[67893]: DEBUG nova.compute.manager [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2079.188157] env[67893]: DEBUG nova.virt.hardware [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2079.188435] env[67893]: DEBUG nova.virt.hardware [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2079.188596] env[67893]: DEBUG nova.virt.hardware [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2079.188779] env[67893]: DEBUG nova.virt.hardware [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2079.188923] env[67893]: DEBUG nova.virt.hardware [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2079.189090] env[67893]: DEBUG nova.virt.hardware [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2079.189304] env[67893]: DEBUG nova.virt.hardware [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2079.189464] env[67893]: DEBUG nova.virt.hardware [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2079.189630] env[67893]: DEBUG nova.virt.hardware [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 
tempest-ServersTestJSON-1487459765-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2079.189792] env[67893]: DEBUG nova.virt.hardware [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2079.189963] env[67893]: DEBUG nova.virt.hardware [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2079.190847] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0134a539-92d2-488f-a50b-dd36720025fd {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2079.198701] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0019f2e-5176-4095-b763-c696496a28db {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2079.446166] env[67893]: DEBUG nova.network.neutron [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Successfully created port: c4ba2b5e-318f-4fc7-88e9-ff3493f4eb8e {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2080.011195] env[67893]: DEBUG nova.compute.manager [req-7fff6ab7-080d-404a-bc8e-926489f266fb req-a92aae4a-6afa-4c9b-8ffc-799ccf027d0a service nova] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Received event network-vif-plugged-c4ba2b5e-318f-4fc7-88e9-ff3493f4eb8e {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2080.011195] env[67893]: DEBUG oslo_concurrency.lockutils [req-7fff6ab7-080d-404a-bc8e-926489f266fb req-a92aae4a-6afa-4c9b-8ffc-799ccf027d0a service nova] Acquiring lock "a71c2ee1-0286-4098-afca-f7666469a95f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2080.011195] env[67893]: DEBUG oslo_concurrency.lockutils [req-7fff6ab7-080d-404a-bc8e-926489f266fb req-a92aae4a-6afa-4c9b-8ffc-799ccf027d0a service nova] Lock "a71c2ee1-0286-4098-afca-f7666469a95f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2080.011195] env[67893]: DEBUG oslo_concurrency.lockutils [req-7fff6ab7-080d-404a-bc8e-926489f266fb req-a92aae4a-6afa-4c9b-8ffc-799ccf027d0a service nova] Lock "a71c2ee1-0286-4098-afca-f7666469a95f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2080.011195] env[67893]: DEBUG nova.compute.manager [req-7fff6ab7-080d-404a-bc8e-926489f266fb req-a92aae4a-6afa-4c9b-8ffc-799ccf027d0a service nova] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] 
No waiting events found dispatching network-vif-plugged-c4ba2b5e-318f-4fc7-88e9-ff3493f4eb8e {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2080.011706] env[67893]: WARNING nova.compute.manager [req-7fff6ab7-080d-404a-bc8e-926489f266fb req-a92aae4a-6afa-4c9b-8ffc-799ccf027d0a service nova] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Received unexpected event network-vif-plugged-c4ba2b5e-318f-4fc7-88e9-ff3493f4eb8e for instance with vm_state building and task_state spawning. [ 2080.086074] env[67893]: DEBUG nova.network.neutron [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Successfully updated port: c4ba2b5e-318f-4fc7-88e9-ff3493f4eb8e {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2080.101904] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquiring lock "refresh_cache-a71c2ee1-0286-4098-afca-f7666469a95f" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2080.101904] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquired lock "refresh_cache-a71c2ee1-0286-4098-afca-f7666469a95f" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2080.101904] env[67893]: DEBUG nova.network.neutron [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2080.135871] env[67893]: DEBUG nova.network.neutron [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Instance cache missing network info. 
{{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2080.283185] env[67893]: DEBUG nova.network.neutron [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Updating instance_info_cache with network_info: [{"id": "c4ba2b5e-318f-4fc7-88e9-ff3493f4eb8e", "address": "fa:16:3e:32:4e:e1", "network": {"id": "f5f37611-ef93-4a5d-8b1c-169af83eb7a6", "bridge": "br-int", "label": "tempest-ServersTestJSON-1801901943-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "88e59a371b0d4dedb303e9b7f6d69b9d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c3e2368-4a35-4aa5-9135-23daedbbf9ef", "external-id": "nsx-vlan-transportzone-125", "segmentation_id": 125, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc4ba2b5e-31", "ovs_interfaceid": "c4ba2b5e-318f-4fc7-88e9-ff3493f4eb8e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2080.294795] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Releasing lock "refresh_cache-a71c2ee1-0286-4098-afca-f7666469a95f" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2080.295086] env[67893]: DEBUG nova.compute.manager [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Instance network_info: |[{"id": "c4ba2b5e-318f-4fc7-88e9-ff3493f4eb8e", "address": "fa:16:3e:32:4e:e1", "network": {"id": "f5f37611-ef93-4a5d-8b1c-169af83eb7a6", "bridge": "br-int", "label": "tempest-ServersTestJSON-1801901943-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "88e59a371b0d4dedb303e9b7f6d69b9d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c3e2368-4a35-4aa5-9135-23daedbbf9ef", "external-id": "nsx-vlan-transportzone-125", "segmentation_id": 125, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc4ba2b5e-31", "ovs_interfaceid": "c4ba2b5e-318f-4fc7-88e9-ff3493f4eb8e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2080.295496] env[67893]: 
DEBUG nova.virt.vmwareapi.vmops [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:32:4e:e1', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8c3e2368-4a35-4aa5-9135-23daedbbf9ef', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c4ba2b5e-318f-4fc7-88e9-ff3493f4eb8e', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2080.303296] env[67893]: DEBUG oslo.service.loopingcall [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2080.303812] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2080.304075] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a78ed486-6777-4164-bf0a-c22f300fcc35 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2080.324666] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2080.324666] env[67893]: value = "task-3455499" [ 2080.324666] env[67893]: _type = "Task" [ 2080.324666] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2080.332207] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455499, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2080.834933] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455499, 'name': CreateVM_Task, 'duration_secs': 0.293264} completed successfully. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2080.835120] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2080.835776] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2080.835936] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2080.836274] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2080.836546] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fa9699bf-bcf6-4fdc-8008-580cda103b86 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2080.840835] env[67893]: DEBUG oslo_vmware.api [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Waiting for the task: (returnval){ [ 2080.840835] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52a7b5a9-5085-aece-2f06-fb19febd2a2a" [ 2080.840835] env[67893]: _type = "Task" [ 2080.840835] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2080.848169] env[67893]: DEBUG oslo_vmware.api [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52a7b5a9-5085-aece-2f06-fb19febd2a2a, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2081.351431] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2081.351774] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2081.351944] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2082.038387] env[67893]: DEBUG nova.compute.manager [req-9f1e1e8e-25c0-427e-af62-1edd12c252a5 req-49db2a66-51c4-4a47-9e0b-4d4a513f5cfb service nova] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Received event network-changed-c4ba2b5e-318f-4fc7-88e9-ff3493f4eb8e {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2082.038632] env[67893]: DEBUG nova.compute.manager [req-9f1e1e8e-25c0-427e-af62-1edd12c252a5 req-49db2a66-51c4-4a47-9e0b-4d4a513f5cfb service nova] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Refreshing instance network info cache due to event network-changed-c4ba2b5e-318f-4fc7-88e9-ff3493f4eb8e. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2082.038845] env[67893]: DEBUG oslo_concurrency.lockutils [req-9f1e1e8e-25c0-427e-af62-1edd12c252a5 req-49db2a66-51c4-4a47-9e0b-4d4a513f5cfb service nova] Acquiring lock "refresh_cache-a71c2ee1-0286-4098-afca-f7666469a95f" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2082.038987] env[67893]: DEBUG oslo_concurrency.lockutils [req-9f1e1e8e-25c0-427e-af62-1edd12c252a5 req-49db2a66-51c4-4a47-9e0b-4d4a513f5cfb service nova] Acquired lock "refresh_cache-a71c2ee1-0286-4098-afca-f7666469a95f" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2082.039163] env[67893]: DEBUG nova.network.neutron [req-9f1e1e8e-25c0-427e-af62-1edd12c252a5 req-49db2a66-51c4-4a47-9e0b-4d4a513f5cfb service nova] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Refreshing network info cache for port c4ba2b5e-318f-4fc7-88e9-ff3493f4eb8e {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2082.298134] env[67893]: DEBUG nova.network.neutron [req-9f1e1e8e-25c0-427e-af62-1edd12c252a5 req-49db2a66-51c4-4a47-9e0b-4d4a513f5cfb service nova] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Updated VIF entry in instance network info cache for port c4ba2b5e-318f-4fc7-88e9-ff3493f4eb8e. 
{{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2082.298487] env[67893]: DEBUG nova.network.neutron [req-9f1e1e8e-25c0-427e-af62-1edd12c252a5 req-49db2a66-51c4-4a47-9e0b-4d4a513f5cfb service nova] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Updating instance_info_cache with network_info: [{"id": "c4ba2b5e-318f-4fc7-88e9-ff3493f4eb8e", "address": "fa:16:3e:32:4e:e1", "network": {"id": "f5f37611-ef93-4a5d-8b1c-169af83eb7a6", "bridge": "br-int", "label": "tempest-ServersTestJSON-1801901943-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "88e59a371b0d4dedb303e9b7f6d69b9d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c3e2368-4a35-4aa5-9135-23daedbbf9ef", "external-id": "nsx-vlan-transportzone-125", "segmentation_id": 125, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc4ba2b5e-31", "ovs_interfaceid": "c4ba2b5e-318f-4fc7-88e9-ff3493f4eb8e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2082.307357] env[67893]: DEBUG oslo_concurrency.lockutils [req-9f1e1e8e-25c0-427e-af62-1edd12c252a5 req-49db2a66-51c4-4a47-9e0b-4d4a513f5cfb service nova] Releasing lock "refresh_cache-a71c2ee1-0286-4098-afca-f7666469a95f" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2094.197844] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2096.854820] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2096.858407] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2097.858570] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2097.858853] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2097.858853] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) 
_heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2097.880661] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2097.880843] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2097.880977] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2097.881145] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2097.881279] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2097.881400] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 11000d92-0094-4561-a807-ca76610ea549] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2097.881518] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2097.881633] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2097.881749] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2097.881865] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2097.881983] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2097.882524] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2097.882772] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2097.882866] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2097.883042] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2097.883169] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Cleaning up deleted instances with incomplete migration {{(pid=67893) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 2100.866661] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2101.858729] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2102.859059] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2102.859059] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Cleaning up deleted instances {{(pid=67893) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 2102.868445] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] There are 0 instances to clean {{(pid=67893) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 2104.869655] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2104.881270] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2104.881476] env[67893]: DEBUG oslo_concurrency.lockutils [None 
req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2104.881646] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2104.881803] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2104.883311] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7308e86a-0005-48c2-9e90-6f82db8a3c58 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2104.891354] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18ed03bd-9482-4d61-ba6d-41ef3d9c1ce5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2104.904663] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79c0f2b1-1d1a-429d-8c83-6cad3fef0940 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2104.910516] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7ebe648-f9b1-4ec8-9e92-cedc606dfc8b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2104.938650] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180988MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2104.938830] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2104.939012] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2105.041757] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance dfb92d1c-c2a5-49c1-8526-3743cb385c97 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2105.041941] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance ad60df35-54c0-459e-8a25-981922ae0a88 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2105.042123] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 9fc9f6b0-928e-46b4-ad7c-9217b2f31575 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2105.042261] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 15893b5f-a02a-4ce7-80c9-eea0658f9ac7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2105.042382] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 94760898-4f3c-4f41-85be-366f4108d0ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2105.042501] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 11000d92-0094-4561-a807-ca76610ea549 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2105.042617] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 7169c720-f69e-40a3-95d2-473639884cd9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2105.042736] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 72410dc2-74d9-4d59-bdd1-ad45b01c482b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2105.042875] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a5151a22-4174-4f66-a83a-55a0dd01c407 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2105.042999] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a71c2ee1-0286-4098-afca-f7666469a95f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2105.043202] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 2105.043339] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 2105.058591] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Refreshing inventories for resource provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}}
[ 2105.071223] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Updating ProviderTree inventory for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}}
[ 2105.071403] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Updating inventory in ProviderTree for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}}
[ 2105.081336] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Refreshing aggregate associations for resource provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57, aggregates: None {{(pid=67893) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}}
[ 2105.097850] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Refreshing trait associations for resource provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO {{(pid=67893) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}}
[ 2105.211171] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e92cad79-f1f4-4381-b09a-b079839e8bdc {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2105.219966] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17c885e0-7413-4517-88bc-8c74896db38e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2105.248376] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-839db999-82c4-4efc-a339-9fe8d4cdcb22 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2105.255477] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ea74784-ce85-4008-ba92-9342f88d80a6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2105.267915] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2105.276509] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2105.289645] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 2105.289814] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.351s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2106.859361] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2108.981700] env[67893]: WARNING oslo_vmware.rw_handles [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 2108.981700] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 2108.981700] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 2108.981700] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 2108.981700] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 2108.981700] env[67893]: ERROR oslo_vmware.rw_handles response.begin()
[ 2108.981700] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 2108.981700] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 2108.981700] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 2108.981700] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 2108.981700] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 2108.981700] env[67893]: ERROR oslo_vmware.rw_handles
[ 2108.982322] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/4b83264d-4dc7-493e-adf6-7d785bb48a11/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 2108.984379] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 2108.984617] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Copying Virtual Disk [datastore1] vmware_temp/4b83264d-4dc7-493e-adf6-7d785bb48a11/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/4b83264d-4dc7-493e-adf6-7d785bb48a11/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 2108.984917] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-fb82e3dd-3057-4d4e-b45b-6a4586035a99 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2108.992651] env[67893]: DEBUG oslo_vmware.api [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Waiting for the task: (returnval){
[ 2108.992651] env[67893]: value = "task-3455500"
[ 2108.992651] env[67893]: _type = "Task"
[ 2108.992651] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2109.000208] env[67893]: DEBUG oslo_vmware.api [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Task: {'id': task-3455500, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2109.503291] env[67893]: DEBUG oslo_vmware.exceptions [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 2109.503546] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2109.504139] env[67893]: ERROR nova.compute.manager [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2109.504139] env[67893]: Faults: ['InvalidArgument']
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Traceback (most recent call last):
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] yield resources
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] self.driver.spawn(context, instance, image_meta,
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] self._fetch_image_if_missing(context, vi)
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] image_cache(vi, tmp_image_ds_loc)
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] vm_util.copy_virtual_disk(
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] session._wait_for_task(vmdk_copy_task)
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] return self.wait_for_task(task_ref)
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] return evt.wait()
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] result = hub.switch()
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] return self.greenlet.switch()
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] self.f(*self.args, **self.kw)
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] raise exceptions.translate_fault(task_info.error)
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Faults: ['InvalidArgument']
[ 2109.504139] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97]
[ 2109.504852] env[67893]: INFO nova.compute.manager [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Terminating instance
[ 2109.506058] env[67893]: DEBUG oslo_concurrency.lockutils [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2109.506292] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2109.506539] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d85c06ac-da64-4ce1-8db5-16309b479f67 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2109.508683] env[67893]: DEBUG nova.compute.manager [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 2109.508879] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2109.509648] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec6b445c-23ff-4e33-8208-837dd3b641f4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2109.517861] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 2109.518099] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-04e2bf08-b90e-4710-b984-8c1f2a0adf9d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2109.520153] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2109.520320] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 2109.521254] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e3b49be7-fa09-47ef-befe-e7848e67a360 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2109.525857] env[67893]: DEBUG oslo_vmware.api [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Waiting for the task: (returnval){
[ 2109.525857] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]528e096e-c6df-b5b4-fe3d-e162129f6dc7"
[ 2109.525857] env[67893]: _type = "Task"
[ 2109.525857] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2109.532701] env[67893]: DEBUG oslo_vmware.api [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]528e096e-c6df-b5b4-fe3d-e162129f6dc7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2109.596015] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 2109.596308] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 2109.596442] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Deleting the datastore file [datastore1] dfb92d1c-c2a5-49c1-8526-3743cb385c97 {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 2109.596710] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1bcad634-9150-433f-8385-61158ece891a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2109.602268] env[67893]: DEBUG oslo_vmware.api [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Waiting for the task: (returnval){
[ 2109.602268] env[67893]: value = "task-3455502"
[ 2109.602268] env[67893]: _type = "Task"
[ 2109.602268] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2109.609583] env[67893]: DEBUG oslo_vmware.api [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Task: {'id': task-3455502, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2110.036484] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 2110.036780] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Creating directory with path [datastore1] vmware_temp/0da480ed-c682-4369-8737-9589d4eee024/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2110.036975] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3d9ad617-7d3f-442a-8a1b-362a7d234278 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2110.048185] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Created directory with path [datastore1] vmware_temp/0da480ed-c682-4369-8737-9589d4eee024/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2110.048379] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Fetch image to [datastore1] vmware_temp/0da480ed-c682-4369-8737-9589d4eee024/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 2110.048550] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/0da480ed-c682-4369-8737-9589d4eee024/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 2110.049295] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84a7e82f-dd02-47d9-986a-fc61636cecf0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2110.056321] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de7e0f9a-2116-4d04-90fe-9e31eceb2af3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2110.065014] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54cbab5e-793a-4a25-bc27-9272458fb891 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2110.094370] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e7c5b69-1e8d-405c-a77c-633a5f6b4447 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2110.099601] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-60a9a03a-990c-423f-92e9-d083937a14d9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2110.110246] env[67893]: DEBUG oslo_vmware.api [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Task: {'id': task-3455502, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074732} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 2110.110475] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 2110.110655] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 2110.110824] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 2110.111009] env[67893]: INFO nova.compute.manager [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Took 0.60 seconds to destroy the instance on the hypervisor.
[ 2110.113061] env[67893]: DEBUG nova.compute.claims [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 2110.113239] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2110.113450] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2110.123910] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 2110.278813] env[67893]: DEBUG oslo_vmware.rw_handles [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0da480ed-c682-4369-8737-9589d4eee024/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 2110.335794] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8fbb1087-424f-431b-88ae-ccd58ad8a3af {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2110.340327] env[67893]: DEBUG oslo_vmware.rw_handles [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 2110.340499] env[67893]: DEBUG oslo_vmware.rw_handles [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0da480ed-c682-4369-8737-9589d4eee024/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 2110.344080] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c67df35-8f43-421c-9a67-58a5d914434c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2110.375167] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a630a51-76ff-4fc2-aaac-7795a20c4524 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2110.383019] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5736c4b5-9fe1-4e60-8638-c6195c5e691a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2110.395108] env[67893]: DEBUG nova.compute.provider_tree [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2110.405156] env[67893]: DEBUG nova.scheduler.client.report [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2110.420295] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.307s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2110.420900] env[67893]: ERROR nova.compute.manager [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2110.420900] env[67893]: Faults: ['InvalidArgument']
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Traceback (most recent call last):
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] self.driver.spawn(context, instance, image_meta,
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] self._fetch_image_if_missing(context, vi)
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] image_cache(vi, tmp_image_ds_loc)
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] vm_util.copy_virtual_disk(
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] session._wait_for_task(vmdk_copy_task)
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] return self.wait_for_task(task_ref)
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] return evt.wait()
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] result = hub.switch()
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] return self.greenlet.switch()
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] self.f(*self.args, **self.kw)
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] raise exceptions.translate_fault(task_info.error)
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Faults: ['InvalidArgument']
[ 2110.420900] env[67893]: ERROR nova.compute.manager [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97]
[ 2110.421697] env[67893]: DEBUG nova.compute.utils [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 2110.424033] env[67893]: DEBUG nova.compute.manager [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Build of instance dfb92d1c-c2a5-49c1-8526-3743cb385c97 was re-scheduled: A specified parameter was not correct: fileType
[ 2110.424033] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 2110.424033] env[67893]: DEBUG nova.compute.manager [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 2110.424033] env[67893]: DEBUG nova.compute.manager [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 2110.424033] env[67893]: DEBUG nova.compute.manager [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 2110.424375] env[67893]: DEBUG nova.network.neutron [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2110.816447] env[67893]: DEBUG nova.network.neutron [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2110.828722] env[67893]: INFO nova.compute.manager [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Took 0.40 seconds to deallocate network for instance.
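[Editor's note] Between the spawn failure and the deallocation above, the manager aborts the resource claim, releases "compute_resources", marks the build as re-scheduled, and tears down networking. A rough sketch of that cleanup ordering follows; `claim`, `network_api`, and `RescheduledException` are simplified stand-ins, not Nova's actual manager signature.

    # Hedged sketch of the failure path recorded above (assumed names).
    class RescheduledException(Exception):
        pass

    def build_and_run_instance(driver, claim, network_api, context, instance):
        try:
            driver.spawn(context, instance)  # raised VimFaultException above
        except Exception as exc:
            claim.abort()  # "Aborting claim" + "compute_resources" release
            network_api.deallocate_for_instance(context, instance)  # "Deallocating network"
            # "Build of instance ... was re-scheduled: ..." hands the request
            # back to the scheduler instead of failing the build outright.
            raise RescheduledException(str(exc)) from exc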
[ 2110.932086] env[67893]: INFO nova.scheduler.client.report [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Deleted allocations for instance dfb92d1c-c2a5-49c1-8526-3743cb385c97
[ 2110.955065] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dbf548bd-9249-47db-87ca-60362cae31cf tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Lock "dfb92d1c-c2a5-49c1-8526-3743cb385c97" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 652.489s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2110.955344] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d7d99f6e-fbaa-4229-aea0-700e12581091 tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Lock "dfb92d1c-c2a5-49c1-8526-3743cb385c97" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 456.972s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2110.955568] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d7d99f6e-fbaa-4229-aea0-700e12581091 tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Acquiring lock "dfb92d1c-c2a5-49c1-8526-3743cb385c97-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2110.955774] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d7d99f6e-fbaa-4229-aea0-700e12581091 tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Lock "dfb92d1c-c2a5-49c1-8526-3743cb385c97-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2110.955933] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d7d99f6e-fbaa-4229-aea0-700e12581091 tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Lock "dfb92d1c-c2a5-49c1-8526-3743cb385c97-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2110.957875] env[67893]: INFO nova.compute.manager [None req-d7d99f6e-fbaa-4229-aea0-700e12581091 tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Terminating instance
[ 2110.959913] env[67893]: DEBUG nova.compute.manager [None req-d7d99f6e-fbaa-4229-aea0-700e12581091 tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 2110.960123] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-d7d99f6e-fbaa-4229-aea0-700e12581091 tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2110.960389] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8c4a2a5c-f50a-48c3-8bfc-cd7a7b4cbbe9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2110.970048] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a64c9750-69cc-488f-813b-892eb1db6042 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2110.999645] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-d7d99f6e-fbaa-4229-aea0-700e12581091 tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance dfb92d1c-c2a5-49c1-8526-3743cb385c97 could not be found.
[ 2110.999645] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-d7d99f6e-fbaa-4229-aea0-700e12581091 tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 2110.999757] env[67893]: INFO nova.compute.manager [None req-d7d99f6e-fbaa-4229-aea0-700e12581091 tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 2111.000028] env[67893]: DEBUG oslo.service.loopingcall [None req-d7d99f6e-fbaa-4229-aea0-700e12581091 tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 2111.000251] env[67893]: DEBUG nova.compute.manager [-] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 2111.000550] env[67893]: DEBUG nova.network.neutron [-] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2111.024762] env[67893]: DEBUG nova.network.neutron [-] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2111.033059] env[67893]: INFO nova.compute.manager [-] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] Took 0.03 seconds to deallocate network for instance.
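[Editor's note] The Acquiring/acquired/"released" lock records throughout this log, with their waited/held timings, are emitted by oslo.concurrency's lockutils (the `inner` wrapper at lockutils.py:404/409/423). For reference, the standard way that instrumentation gets attached is the `synchronized` decorator; a minimal usage example of the real API:

    from oslo_concurrency import lockutils

    # Serializes callers on the named lock and logs acquire/release with
    # waited/held durations, matching the "compute_resources" lines above.
    @lockutils.synchronized('compute_resources')
    def update_available_resource():
        ...  # critical section: resource tracker bookkeeping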
[ 2111.121634] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d7d99f6e-fbaa-4229-aea0-700e12581091 tempest-ServerActionsTestOtherA-1715011108 tempest-ServerActionsTestOtherA-1715011108-project-member] Lock "dfb92d1c-c2a5-49c1-8526-3743cb385c97" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.166s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2111.123333] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "dfb92d1c-c2a5-49c1-8526-3743cb385c97" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 306.751s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2111.123585] env[67893]: INFO nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: dfb92d1c-c2a5-49c1-8526-3743cb385c97] During sync_power_state the instance has a pending task (deleting). Skip.
[ 2111.123677] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "dfb92d1c-c2a5-49c1-8526-3743cb385c97" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2154.866711] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2156.860011] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2157.854809] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2157.858392] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2158.859344] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2158.859688] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 2158.859688] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 2158.879158] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2158.879307] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2158.879438] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2158.879570] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2158.879690] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 11000d92-0094-4561-a807-ca76610ea549] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2158.879811] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2158.879929] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2158.880058] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2158.880181] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2158.880301] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 2159.859552] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2159.859552] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 2160.104063] env[67893]: WARNING oslo_vmware.rw_handles [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 2160.104063] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 2160.104063] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 2160.104063] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 2160.104063] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 2160.104063] env[67893]: ERROR oslo_vmware.rw_handles response.begin()
[ 2160.104063] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 2160.104063] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 2160.104063] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 2160.104063] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 2160.104063] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 2160.104063] env[67893]: ERROR oslo_vmware.rw_handles
[ 2160.104063] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/0da480ed-c682-4369-8737-9589d4eee024/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 2160.105908] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 2160.106195] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Copying Virtual Disk [datastore1] vmware_temp/0da480ed-c682-4369-8737-9589d4eee024/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/0da480ed-c682-4369-8737-9589d4eee024/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 2160.106483] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-892dd88f-7d1f-4341-803c-fba227d75fd1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2160.114709] env[67893]: DEBUG oslo_vmware.api [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Waiting for the task: (returnval){
[ 2160.114709] env[67893]: value = "task-3455503"
[ 2160.114709] env[67893]: _type = "Task"
[ 2160.114709] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2160.124961] env[67893]: DEBUG oslo_vmware.api [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Task: {'id': task-3455503, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2160.626078] env[67893]: DEBUG oslo_vmware.exceptions [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 2160.626078] env[67893]: DEBUG oslo_concurrency.lockutils [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2160.626265] env[67893]: ERROR nova.compute.manager [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2160.626265] env[67893]: Faults: ['InvalidArgument']
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Traceback (most recent call last):
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] yield resources
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] self.driver.spawn(context, instance, image_meta,
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] self._fetch_image_if_missing(context, vi)
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] image_cache(vi, tmp_image_ds_loc)
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] vm_util.copy_virtual_disk(
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] session._wait_for_task(vmdk_copy_task)
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] return self.wait_for_task(task_ref)
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] return evt.wait()
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] result = hub.switch()
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] return self.greenlet.switch()
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] self.f(*self.args, **self.kw)
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] raise exceptions.translate_fault(task_info.error)
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Faults: ['InvalidArgument']
[ 2160.626265] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88]
[ 2160.627150] env[67893]: INFO nova.compute.manager [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Terminating instance
[ 2160.628048] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2160.628260] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2160.628497] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a2c9ad2d-f450-419b-a110-dbd2c404ebc3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2160.630593] env[67893]: DEBUG nova.compute.manager [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 2160.630782] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2160.631484] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f77642e-fa76-4482-b96c-629fe9e31be6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2160.637941] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 2160.638162] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-361ea2c4-3892-4014-b499-afe26da665cf {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2160.640189] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2160.640363] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 2160.641304] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b61a02dc-2f71-4488-ab07-f231fe92ccba {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2160.645890] env[67893]: DEBUG oslo_vmware.api [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Waiting for the task: (returnval){
[ 2160.645890] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52d46941-44c3-2eb0-3c65-a185431b9387"
[ 2160.645890] env[67893]: _type = "Task"
[ 2160.645890] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2160.652648] env[67893]: DEBUG oslo_vmware.api [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52d46941-44c3-2eb0-3c65-a185431b9387, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2160.706751] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 2160.707018] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 2160.707165] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Deleting the datastore file [datastore1] ad60df35-54c0-459e-8a25-981922ae0a88 {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 2160.707411] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6de92ffa-1186-4bf9-8caf-ba6ab9e19b72 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2160.714149] env[67893]: DEBUG oslo_vmware.api [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Waiting for the task: (returnval){
[ 2160.714149] env[67893]: value = "task-3455505"
[ 2160.714149] env[67893]: _type = "Task"
[ 2160.714149] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2160.722335] env[67893]: DEBUG oslo_vmware.api [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Task: {'id': task-3455505, 'name': DeleteDatastoreFile_Task} progress is 0%.
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2161.156510] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2161.156968] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Creating directory with path [datastore1] vmware_temp/b9e0cdf7-0d57-4d38-af05-5b819d4457bf/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2161.157032] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8015d127-e2dc-4e67-ba71-bd8f6540128b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2161.169036] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Created directory with path [datastore1] vmware_temp/b9e0cdf7-0d57-4d38-af05-5b819d4457bf/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2161.169225] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Fetch image to [datastore1] vmware_temp/b9e0cdf7-0d57-4d38-af05-5b819d4457bf/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2161.169390] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/b9e0cdf7-0d57-4d38-af05-5b819d4457bf/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2161.170094] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-855b6d4f-5ac9-4620-b294-7ec58d926cd8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2161.176506] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edb2c952-581d-4d86-8d4b-20e3761eafbd {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2161.185069] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7046d398-ea93-48ff-a605-563add48c7fe {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2161.214644] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7beac593-6d2e-410f-b281-6d38e1718653 
{{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2161.224950] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8a767ab9-abda-41b2-a0c3-6c92ef2ab834 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2161.226558] env[67893]: DEBUG oslo_vmware.api [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Task: {'id': task-3455505, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077843} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2161.226789] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2161.226966] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2161.227153] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2161.227323] env[67893]: INFO nova.compute.manager [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 2161.229416] env[67893]: DEBUG nova.compute.claims [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2161.229602] env[67893]: DEBUG oslo_concurrency.lockutils [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2161.229821] env[67893]: DEBUG oslo_concurrency.lockutils [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2161.251066] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2161.306272] env[67893]: DEBUG oslo_vmware.rw_handles [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b9e0cdf7-0d57-4d38-af05-5b819d4457bf/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2161.365931] env[67893]: DEBUG oslo_vmware.rw_handles [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2161.366194] env[67893]: DEBUG oslo_vmware.rw_handles [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b9e0cdf7-0d57-4d38-af05-5b819d4457bf/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2161.460082] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5bdc3b48-2611-4d70-ba34-e7ba0a808160 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2161.467740] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7725521e-02f1-4407-b343-64eb0e76f3f3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2161.498036] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf47dff5-4c2b-4309-9082-29eac20af32c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2161.505016] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-255ae58a-689a-4c82-b52f-3bef3949d2aa {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2161.517526] env[67893]: DEBUG nova.compute.provider_tree [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2161.525598] env[67893]: DEBUG nova.scheduler.client.report [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2161.541099] env[67893]: DEBUG oslo_concurrency.lockutils [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.311s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2161.541635] env[67893]: ERROR nova.compute.manager [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2161.541635] env[67893]: Faults: ['InvalidArgument'] [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Traceback (most recent call last): [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: 
ad60df35-54c0-459e-8a25-981922ae0a88] self.driver.spawn(context, instance, image_meta, [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] self._fetch_image_if_missing(context, vi) [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] image_cache(vi, tmp_image_ds_loc) [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] vm_util.copy_virtual_disk( [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] session._wait_for_task(vmdk_copy_task) [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] return self.wait_for_task(task_ref) [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] return evt.wait() [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] result = hub.switch() [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] return self.greenlet.switch() [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] self.f(*self.args, **self.kw) [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] raise exceptions.translate_fault(task_info.error) [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Faults: ['InvalidArgument'] [ 2161.541635] env[67893]: ERROR nova.compute.manager [instance: ad60df35-54c0-459e-8a25-981922ae0a88] [ 2161.542348] env[67893]: DEBUG nova.compute.utils [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2161.543695] env[67893]: DEBUG nova.compute.manager [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Build of instance ad60df35-54c0-459e-8a25-981922ae0a88 was re-scheduled: A specified parameter was not correct: fileType [ 2161.543695] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2161.544077] env[67893]: DEBUG nova.compute.manager [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2161.544256] env[67893]: DEBUG nova.compute.manager [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2161.544427] env[67893]: DEBUG nova.compute.manager [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2161.544588] env[67893]: DEBUG nova.network.neutron [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2161.859480] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2162.074419] env[67893]: DEBUG nova.network.neutron [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2162.088089] env[67893]: INFO nova.compute.manager [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Took 0.54 seconds to deallocate network for instance. [ 2162.186394] env[67893]: INFO nova.scheduler.client.report [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Deleted allocations for instance ad60df35-54c0-459e-8a25-981922ae0a88 [ 2162.210024] env[67893]: DEBUG oslo_concurrency.lockutils [None req-388b66e7-a5ef-4969-ad67-ec72714b3315 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Lock "ad60df35-54c0-459e-8a25-981922ae0a88" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 689.678s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2162.210024] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dc4e829b-0b43-47eb-b00f-32763c950794 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Lock "ad60df35-54c0-459e-8a25-981922ae0a88" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 493.734s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2162.210197] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dc4e829b-0b43-47eb-b00f-32763c950794 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Acquiring lock "ad60df35-54c0-459e-8a25-981922ae0a88-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2162.210312] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dc4e829b-0b43-47eb-b00f-32763c950794 tempest-ImagesTestJSON-2118872471 
tempest-ImagesTestJSON-2118872471-project-member] Lock "ad60df35-54c0-459e-8a25-981922ae0a88-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2162.210507] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dc4e829b-0b43-47eb-b00f-32763c950794 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Lock "ad60df35-54c0-459e-8a25-981922ae0a88-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2162.212878] env[67893]: INFO nova.compute.manager [None req-dc4e829b-0b43-47eb-b00f-32763c950794 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Terminating instance [ 2162.219269] env[67893]: DEBUG nova.compute.manager [None req-dc4e829b-0b43-47eb-b00f-32763c950794 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2162.219269] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-dc4e829b-0b43-47eb-b00f-32763c950794 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2162.219269] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5abee68e-97cd-44f9-bc48-4c2f871e98de {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2162.224386] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cddff1c1-2816-4433-8408-e5284a49d1e7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2162.253960] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-dc4e829b-0b43-47eb-b00f-32763c950794 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ad60df35-54c0-459e-8a25-981922ae0a88 could not be found. [ 2162.254175] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-dc4e829b-0b43-47eb-b00f-32763c950794 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2162.254375] env[67893]: INFO nova.compute.manager [None req-dc4e829b-0b43-47eb-b00f-32763c950794 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2162.254630] env[67893]: DEBUG oslo.service.loopingcall [None req-dc4e829b-0b43-47eb-b00f-32763c950794 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2162.254869] env[67893]: DEBUG nova.compute.manager [-] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2162.254962] env[67893]: DEBUG nova.network.neutron [-] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2162.285271] env[67893]: DEBUG nova.network.neutron [-] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2162.294322] env[67893]: INFO nova.compute.manager [-] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] Took 0.04 seconds to deallocate network for instance. [ 2162.387169] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dc4e829b-0b43-47eb-b00f-32763c950794 tempest-ImagesTestJSON-2118872471 tempest-ImagesTestJSON-2118872471-project-member] Lock "ad60df35-54c0-459e-8a25-981922ae0a88" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.177s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2162.388948] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "ad60df35-54c0-459e-8a25-981922ae0a88" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 358.017s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2162.389280] env[67893]: INFO nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: ad60df35-54c0-459e-8a25-981922ae0a88] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 2162.389564] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "ad60df35-54c0-459e-8a25-981922ae0a88" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2162.859313] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2163.854429] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2164.860074] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2164.870360] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2164.870589] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2164.870755] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2164.870910] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2164.872465] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd9881d3-9950-44d4-849c-e079a178174c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2164.880734] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8f1def4-8bb0-4fba-acdb-34697e082ddf {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2164.895519] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6785548-b43b-4e65-af6f-cd8eebba0642 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2164.901719] env[67893]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b12ce9dc-cd85-4b5d-9cae-7e5f417681eb {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2164.930169] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180961MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2164.930358] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2164.930597] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2164.994999] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 9fc9f6b0-928e-46b4-ad7c-9217b2f31575 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2164.995185] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 15893b5f-a02a-4ce7-80c9-eea0658f9ac7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2164.995317] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 94760898-4f3c-4f41-85be-366f4108d0ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2164.995442] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 11000d92-0094-4561-a807-ca76610ea549 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2164.995563] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 7169c720-f69e-40a3-95d2-473639884cd9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2164.995680] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 72410dc2-74d9-4d59-bdd1-ad45b01c482b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2164.995834] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a5151a22-4174-4f66-a83a-55a0dd01c407 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2164.995970] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a71c2ee1-0286-4098-afca-f7666469a95f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2164.996168] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2164.996308] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2165.090365] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80a39bfe-d690-4095-9061-9a866a4f210f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2165.097728] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10dbd024-85ad-43fd-a4f0-95c7ddeaf1fc {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2165.126669] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7029f5a-dcee-499b-90c1-aaa6f46509b9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2165.133486] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-456637eb-f3b3-44e0-84ec-db6d091d0b8b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2165.145937] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2165.153911] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 
17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2165.168722] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2165.168905] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.238s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2167.587693] env[67893]: DEBUG oslo_concurrency.lockutils [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "d0f623f5-88e9-4806-8f30-584d277ba5fe" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2167.588014] env[67893]: DEBUG oslo_concurrency.lockutils [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "d0f623f5-88e9-4806-8f30-584d277ba5fe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2167.598607] env[67893]: DEBUG nova.compute.manager [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Starting instance... 
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2167.648054] env[67893]: DEBUG oslo_concurrency.lockutils [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2167.648302] env[67893]: DEBUG oslo_concurrency.lockutils [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2167.649711] env[67893]: INFO nova.compute.claims [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2167.797962] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77fe3483-61b0-4796-ad6d-3ebc82a3cea1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2167.805231] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3450c3e-24bd-4f44-9513-8d638dd604f4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2167.834084] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ed5d387-f1ef-487f-bac4-f11a9f49d9c8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2167.840411] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-794a308a-d9b8-441b-bd77-92f0db66c8e2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2167.852835] env[67893]: DEBUG nova.compute.provider_tree [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2167.861853] env[67893]: DEBUG nova.scheduler.client.report [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2167.876574] env[67893]: DEBUG 
oslo_concurrency.lockutils [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.228s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2167.877036] env[67893]: DEBUG nova.compute.manager [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2167.908297] env[67893]: DEBUG nova.compute.utils [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2167.909511] env[67893]: DEBUG nova.compute.manager [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2167.909651] env[67893]: DEBUG nova.network.neutron [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2167.943026] env[67893]: DEBUG nova.compute.manager [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2167.981886] env[67893]: DEBUG nova.policy [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9115f73c22bf4b0e9e5439363832061d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7a19d9bde3814325847c06cec1af09b7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 2168.004984] env[67893]: DEBUG nova.compute.manager [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2168.028608] env[67893]: DEBUG nova.virt.hardware [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2168.028860] env[67893]: DEBUG nova.virt.hardware [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2168.029030] env[67893]: DEBUG nova.virt.hardware [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2168.029218] env[67893]: DEBUG nova.virt.hardware [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2168.029364] env[67893]: DEBUG nova.virt.hardware [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2168.029512] env[67893]: DEBUG nova.virt.hardware [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2168.029720] env[67893]: DEBUG nova.virt.hardware [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2168.029880] env[67893]: DEBUG nova.virt.hardware [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2168.030138] 
env[67893]: DEBUG nova.virt.hardware [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2168.030372] env[67893]: DEBUG nova.virt.hardware [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2168.030623] env[67893]: DEBUG nova.virt.hardware [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2168.031505] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f3abb33-e1c1-4d4e-8755-9d36ca3f07ce {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2168.041900] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0130f81-176d-45a9-a91f-fc6236a9b5d8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2168.363623] env[67893]: DEBUG nova.network.neutron [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Successfully created port: 887d5ac0-1fec-4e07-a5fc-3e676a540d8f {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2168.885948] env[67893]: DEBUG nova.compute.manager [req-38616882-4c1c-4a2b-b67d-f48b4779b2bc req-f7456abb-26b1-466e-81cb-23e9b3acaee4 service nova] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Received event network-vif-plugged-887d5ac0-1fec-4e07-a5fc-3e676a540d8f {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2168.886249] env[67893]: DEBUG oslo_concurrency.lockutils [req-38616882-4c1c-4a2b-b67d-f48b4779b2bc req-f7456abb-26b1-466e-81cb-23e9b3acaee4 service nova] Acquiring lock "d0f623f5-88e9-4806-8f30-584d277ba5fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2168.886435] env[67893]: DEBUG oslo_concurrency.lockutils [req-38616882-4c1c-4a2b-b67d-f48b4779b2bc req-f7456abb-26b1-466e-81cb-23e9b3acaee4 service nova] Lock "d0f623f5-88e9-4806-8f30-584d277ba5fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2168.886597] env[67893]: DEBUG oslo_concurrency.lockutils [req-38616882-4c1c-4a2b-b67d-f48b4779b2bc req-f7456abb-26b1-466e-81cb-23e9b3acaee4 service nova] Lock "d0f623f5-88e9-4806-8f30-584d277ba5fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 
2168.886764] env[67893]: DEBUG nova.compute.manager [req-38616882-4c1c-4a2b-b67d-f48b4779b2bc req-f7456abb-26b1-466e-81cb-23e9b3acaee4 service nova] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] No waiting events found dispatching network-vif-plugged-887d5ac0-1fec-4e07-a5fc-3e676a540d8f {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2168.886930] env[67893]: WARNING nova.compute.manager [req-38616882-4c1c-4a2b-b67d-f48b4779b2bc req-f7456abb-26b1-466e-81cb-23e9b3acaee4 service nova] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Received unexpected event network-vif-plugged-887d5ac0-1fec-4e07-a5fc-3e676a540d8f for instance with vm_state building and task_state spawning. [ 2168.965744] env[67893]: DEBUG nova.network.neutron [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Successfully updated port: 887d5ac0-1fec-4e07-a5fc-3e676a540d8f {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2168.977505] env[67893]: DEBUG oslo_concurrency.lockutils [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "refresh_cache-d0f623f5-88e9-4806-8f30-584d277ba5fe" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2168.977737] env[67893]: DEBUG oslo_concurrency.lockutils [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquired lock "refresh_cache-d0f623f5-88e9-4806-8f30-584d277ba5fe" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2168.978059] env[67893]: DEBUG nova.network.neutron [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2169.020664] env[67893]: DEBUG nova.network.neutron [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Instance cache missing network info. 
{{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2169.175557] env[67893]: DEBUG nova.network.neutron [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Updating instance_info_cache with network_info: [{"id": "887d5ac0-1fec-4e07-a5fc-3e676a540d8f", "address": "fa:16:3e:47:9c:4d", "network": {"id": "b5038471-f3b2-4f1f-b2f9-62effa71f1aa", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1405799721-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7a19d9bde3814325847c06cec1af09b7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ef02af-c508-432f-ae29-3a219701d584", "external-id": "nsx-vlan-transportzone-313", "segmentation_id": 313, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap887d5ac0-1f", "ovs_interfaceid": "887d5ac0-1fec-4e07-a5fc-3e676a540d8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2169.187659] env[67893]: DEBUG oslo_concurrency.lockutils [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Releasing lock "refresh_cache-d0f623f5-88e9-4806-8f30-584d277ba5fe" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2169.187944] env[67893]: DEBUG nova.compute.manager [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Instance network_info: |[{"id": "887d5ac0-1fec-4e07-a5fc-3e676a540d8f", "address": "fa:16:3e:47:9c:4d", "network": {"id": "b5038471-f3b2-4f1f-b2f9-62effa71f1aa", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1405799721-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7a19d9bde3814325847c06cec1af09b7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ef02af-c508-432f-ae29-3a219701d584", "external-id": "nsx-vlan-transportzone-313", "segmentation_id": 313, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap887d5ac0-1f", "ovs_interfaceid": "887d5ac0-1fec-4e07-a5fc-3e676a540d8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 2169.188343] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:47:9c:4d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '89ef02af-c508-432f-ae29-3a219701d584', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '887d5ac0-1fec-4e07-a5fc-3e676a540d8f', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2169.195923] env[67893]: DEBUG oslo.service.loopingcall [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2169.196396] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2169.196628] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ead94be8-0c07-419e-9596-db96d7cffe27 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2169.216828] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2169.216828] env[67893]: value = "task-3455506" [ 2169.216828] env[67893]: _type = "Task" [ 2169.216828] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2169.224402] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455506, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2169.727072] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455506, 'name': CreateVM_Task, 'duration_secs': 0.301069} completed successfully. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2169.727246] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2169.727893] env[67893]: DEBUG oslo_concurrency.lockutils [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2169.728069] env[67893]: DEBUG oslo_concurrency.lockutils [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2169.728401] env[67893]: DEBUG oslo_concurrency.lockutils [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2169.728641] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fb0ae507-480e-4eee-b3bf-f9a66842e8f4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2169.733371] env[67893]: DEBUG oslo_vmware.api [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Waiting for the task: (returnval){ [ 2169.733371] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52dbe2b6-afa4-bdf9-1e19-8cb17a125313" [ 2169.733371] env[67893]: _type = "Task" [ 2169.733371] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2169.740577] env[67893]: DEBUG oslo_vmware.api [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52dbe2b6-afa4-bdf9-1e19-8cb17a125313, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2170.243625] env[67893]: DEBUG oslo_concurrency.lockutils [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2170.244104] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2170.244187] env[67893]: DEBUG oslo_concurrency.lockutils [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2170.915947] env[67893]: DEBUG nova.compute.manager [req-d4417d5c-623e-411d-a91c-a8b415b8806b req-210fda17-2168-4096-aa8e-618c92aa5c3b service nova] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Received event network-changed-887d5ac0-1fec-4e07-a5fc-3e676a540d8f {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2170.916123] env[67893]: DEBUG nova.compute.manager [req-d4417d5c-623e-411d-a91c-a8b415b8806b req-210fda17-2168-4096-aa8e-618c92aa5c3b service nova] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Refreshing instance network info cache due to event network-changed-887d5ac0-1fec-4e07-a5fc-3e676a540d8f. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2170.916358] env[67893]: DEBUG oslo_concurrency.lockutils [req-d4417d5c-623e-411d-a91c-a8b415b8806b req-210fda17-2168-4096-aa8e-618c92aa5c3b service nova] Acquiring lock "refresh_cache-d0f623f5-88e9-4806-8f30-584d277ba5fe" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2170.916502] env[67893]: DEBUG oslo_concurrency.lockutils [req-d4417d5c-623e-411d-a91c-a8b415b8806b req-210fda17-2168-4096-aa8e-618c92aa5c3b service nova] Acquired lock "refresh_cache-d0f623f5-88e9-4806-8f30-584d277ba5fe" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2170.916663] env[67893]: DEBUG nova.network.neutron [req-d4417d5c-623e-411d-a91c-a8b415b8806b req-210fda17-2168-4096-aa8e-618c92aa5c3b service nova] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Refreshing network info cache for port 887d5ac0-1fec-4e07-a5fc-3e676a540d8f {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2171.167464] env[67893]: DEBUG nova.network.neutron [req-d4417d5c-623e-411d-a91c-a8b415b8806b req-210fda17-2168-4096-aa8e-618c92aa5c3b service nova] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Updated VIF entry in instance network info cache for port 887d5ac0-1fec-4e07-a5fc-3e676a540d8f. 
{{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2171.167809] env[67893]: DEBUG nova.network.neutron [req-d4417d5c-623e-411d-a91c-a8b415b8806b req-210fda17-2168-4096-aa8e-618c92aa5c3b service nova] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Updating instance_info_cache with network_info: [{"id": "887d5ac0-1fec-4e07-a5fc-3e676a540d8f", "address": "fa:16:3e:47:9c:4d", "network": {"id": "b5038471-f3b2-4f1f-b2f9-62effa71f1aa", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1405799721-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7a19d9bde3814325847c06cec1af09b7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ef02af-c508-432f-ae29-3a219701d584", "external-id": "nsx-vlan-transportzone-313", "segmentation_id": 313, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap887d5ac0-1f", "ovs_interfaceid": "887d5ac0-1fec-4e07-a5fc-3e676a540d8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2171.177815] env[67893]: DEBUG oslo_concurrency.lockutils [req-d4417d5c-623e-411d-a91c-a8b415b8806b req-210fda17-2168-4096-aa8e-618c92aa5c3b service nova] Releasing lock "refresh_cache-d0f623f5-88e9-4806-8f30-584d277ba5fe" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2207.739565] env[67893]: WARNING oslo_vmware.rw_handles [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2207.739565] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2207.739565] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2207.739565] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2207.739565] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2207.739565] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 2207.739565] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2207.739565] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2207.739565] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2207.739565] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2207.739565] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2207.739565] env[67893]: ERROR oslo_vmware.rw_handles [ 2207.740240] env[67893]: DEBUG nova.virt.vmwareapi.images [None 
req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/b9e0cdf7-0d57-4d38-af05-5b819d4457bf/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2207.742079] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2207.742332] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Copying Virtual Disk [datastore1] vmware_temp/b9e0cdf7-0d57-4d38-af05-5b819d4457bf/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/b9e0cdf7-0d57-4d38-af05-5b819d4457bf/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2207.742998] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-40c1e9ea-3f9d-4174-bf36-19d6d3962e9a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2207.750948] env[67893]: DEBUG oslo_vmware.api [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Waiting for the task: (returnval){ [ 2207.750948] env[67893]: value = "task-3455507" [ 2207.750948] env[67893]: _type = "Task" [ 2207.750948] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2207.758855] env[67893]: DEBUG oslo_vmware.api [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Task: {'id': task-3455507, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2208.261898] env[67893]: DEBUG oslo_vmware.exceptions [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Fault InvalidArgument not matched. 
{{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2208.262183] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2208.262760] env[67893]: ERROR nova.compute.manager [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2208.262760] env[67893]: Faults: ['InvalidArgument'] [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Traceback (most recent call last): [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] yield resources [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] self.driver.spawn(context, instance, image_meta, [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] self._fetch_image_if_missing(context, vi) [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] image_cache(vi, tmp_image_ds_loc) [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] vm_util.copy_virtual_disk( [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] session._wait_for_task(vmdk_copy_task) [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] return self.wait_for_task(task_ref) [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] return evt.wait() [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] result = hub.switch() [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] return self.greenlet.switch() [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] self.f(*self.args, **self.kw) [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] raise exceptions.translate_fault(task_info.error) [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Faults: ['InvalidArgument'] [ 2208.262760] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] [ 2208.263692] env[67893]: INFO nova.compute.manager [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Terminating instance [ 2208.265198] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2208.265198] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2208.265198] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c77da16c-a475-4f34-9676-7ece056b513d 
{{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2208.267413] env[67893]: DEBUG nova.compute.manager [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2208.267623] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2208.268340] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-189b99d6-cfae-42b1-bb29-9f898217f9e3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2208.275157] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2208.275370] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-02d93014-42ba-4f86-9175-2ef211b4e0d1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2208.277477] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2208.277651] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2208.278602] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-17ea154f-99ca-4081-b4c7-4ed0e0c18a0e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2208.283333] env[67893]: DEBUG oslo_vmware.api [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Waiting for the task: (returnval){ [ 2208.283333] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52583721-6dd0-908b-f572-c8a68c40e7cd" [ 2208.283333] env[67893]: _type = "Task" [ 2208.283333] env[67893]: } to complete. 
{{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2208.290232] env[67893]: DEBUG oslo_vmware.api [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52583721-6dd0-908b-f572-c8a68c40e7cd, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2208.351243] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2208.351479] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2208.351654] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Deleting the datastore file [datastore1] 9fc9f6b0-928e-46b4-ad7c-9217b2f31575 {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2208.351938] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-36b5ea64-97f2-4ca7-bef6-86427d069ac7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2208.358015] env[67893]: DEBUG oslo_vmware.api [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Waiting for the task: (returnval){ [ 2208.358015] env[67893]: value = "task-3455509" [ 2208.358015] env[67893]: _type = "Task" [ 2208.358015] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2208.366274] env[67893]: DEBUG oslo_vmware.api [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Task: {'id': task-3455509, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2208.794400] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2208.794726] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Creating directory with path [datastore1] vmware_temp/72095c95-f494-494b-81c0-70180d0fe220/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2208.794893] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f93f5dff-4ba0-4dc8-8cc8-a1d8a117d8a0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2208.806141] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Created directory with path [datastore1] vmware_temp/72095c95-f494-494b-81c0-70180d0fe220/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2208.806329] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Fetch image to [datastore1] vmware_temp/72095c95-f494-494b-81c0-70180d0fe220/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2208.806500] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/72095c95-f494-494b-81c0-70180d0fe220/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2208.807226] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c87438dc-9f39-473b-a898-a156a2e8b420 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2208.813718] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d50a00c-8094-4a37-a1ea-8edef67a6e97 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2208.822446] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f19889ea-65ee-4d64-ad25-7ae93219ac26 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2208.853443] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-5df7a380-1728-43ed-859a-551b7c80b66f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2208.861268] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-757aa37e-2024-4296-afb6-1c27f655abb7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2208.867052] env[67893]: DEBUG oslo_vmware.api [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Task: {'id': task-3455509, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.063718} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2208.867300] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2208.867495] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2208.867664] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2208.867832] env[67893]: INFO nova.compute.manager [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 2208.870035] env[67893]: DEBUG nova.compute.claims [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2208.870205] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2208.870417] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2208.882857] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2208.931738] env[67893]: DEBUG oslo_vmware.rw_handles [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/72095c95-f494-494b-81c0-70180d0fe220/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2208.991371] env[67893]: DEBUG oslo_vmware.rw_handles [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2208.991371] env[67893]: DEBUG oslo_vmware.rw_handles [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/72095c95-f494-494b-81c0-70180d0fe220/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2209.076339] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48d90684-f88c-42b3-8a24-bcd611991ad8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2209.083480] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93abe5d1-1960-4f51-b57e-a7cdcee9a56d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2209.112274] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7dd717ec-693c-4c5b-9c66-60a7f256a05a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2209.118784] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4aaa6f63-23fc-452c-af32-b79517e05eee {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2209.131198] env[67893]: DEBUG nova.compute.provider_tree [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2209.139854] env[67893]: DEBUG nova.scheduler.client.report [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2209.153778] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.283s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2209.154272] env[67893]: ERROR nova.compute.manager [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2209.154272] env[67893]: Faults: ['InvalidArgument'] [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Traceback (most recent call last): [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2209.154272] env[67893]: ERROR 
nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] self.driver.spawn(context, instance, image_meta, [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] self._fetch_image_if_missing(context, vi) [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] image_cache(vi, tmp_image_ds_loc) [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] vm_util.copy_virtual_disk( [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] session._wait_for_task(vmdk_copy_task) [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] return self.wait_for_task(task_ref) [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] return evt.wait() [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] result = hub.switch() [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] return self.greenlet.switch() [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] self.f(*self.args, **self.kw) [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] raise exceptions.translate_fault(task_info.error) [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Faults: ['InvalidArgument'] [ 2209.154272] env[67893]: ERROR nova.compute.manager [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] [ 2209.155108] env[67893]: DEBUG nova.compute.utils [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2209.156304] env[67893]: DEBUG nova.compute.manager [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Build of instance 9fc9f6b0-928e-46b4-ad7c-9217b2f31575 was re-scheduled: A specified parameter was not correct: fileType [ 2209.156304] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2209.156667] env[67893]: DEBUG nova.compute.manager [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2209.156834] env[67893]: DEBUG nova.compute.manager [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2209.157008] env[67893]: DEBUG nova.compute.manager [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2209.157179] env[67893]: DEBUG nova.network.neutron [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2209.530446] env[67893]: DEBUG nova.network.neutron [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2209.545291] env[67893]: INFO nova.compute.manager [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Took 0.39 seconds to deallocate network for instance. [ 2209.649049] env[67893]: INFO nova.scheduler.client.report [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Deleted allocations for instance 9fc9f6b0-928e-46b4-ad7c-9217b2f31575 [ 2209.675296] env[67893]: DEBUG oslo_concurrency.lockutils [None req-a541d1ce-bee8-44cf-86ad-b6d58e1a112f tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Lock "9fc9f6b0-928e-46b4-ad7c-9217b2f31575" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 629.297s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2209.675622] env[67893]: DEBUG oslo_concurrency.lockutils [None req-e4b69b06-dc8d-4a0c-8630-110aa29c0fd4 tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Lock "9fc9f6b0-928e-46b4-ad7c-9217b2f31575" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 433.818s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2209.675861] env[67893]: DEBUG oslo_concurrency.lockutils [None req-e4b69b06-dc8d-4a0c-8630-110aa29c0fd4 tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Acquiring lock "9fc9f6b0-928e-46b4-ad7c-9217b2f31575-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2209.676085] env[67893]: DEBUG oslo_concurrency.lockutils [None req-e4b69b06-dc8d-4a0c-8630-110aa29c0fd4 tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Lock "9fc9f6b0-928e-46b4-ad7c-9217b2f31575-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s 
{{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2209.676262] env[67893]: DEBUG oslo_concurrency.lockutils [None req-e4b69b06-dc8d-4a0c-8630-110aa29c0fd4 tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Lock "9fc9f6b0-928e-46b4-ad7c-9217b2f31575-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2209.678690] env[67893]: INFO nova.compute.manager [None req-e4b69b06-dc8d-4a0c-8630-110aa29c0fd4 tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Terminating instance [ 2209.681010] env[67893]: DEBUG nova.compute.manager [None req-e4b69b06-dc8d-4a0c-8630-110aa29c0fd4 tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2209.681010] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-e4b69b06-dc8d-4a0c-8630-110aa29c0fd4 tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2209.681166] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d6f401ba-61e8-4d56-bdfa-d529f294acc7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2209.692859] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6111e6d7-ae61-47b4-8c2f-25d447f72e84 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2209.721111] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-e4b69b06-dc8d-4a0c-8630-110aa29c0fd4 tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9fc9f6b0-928e-46b4-ad7c-9217b2f31575 could not be found. [ 2209.721339] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-e4b69b06-dc8d-4a0c-8630-110aa29c0fd4 tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2209.721516] env[67893]: INFO nova.compute.manager [None req-e4b69b06-dc8d-4a0c-8630-110aa29c0fd4 tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2209.721762] env[67893]: DEBUG oslo.service.loopingcall [None req-e4b69b06-dc8d-4a0c-8630-110aa29c0fd4 tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2209.722033] env[67893]: DEBUG nova.compute.manager [-] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2209.722135] env[67893]: DEBUG nova.network.neutron [-] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2209.745796] env[67893]: DEBUG nova.network.neutron [-] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2209.753977] env[67893]: INFO nova.compute.manager [-] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] Took 0.03 seconds to deallocate network for instance. [ 2209.842637] env[67893]: DEBUG oslo_concurrency.lockutils [None req-e4b69b06-dc8d-4a0c-8630-110aa29c0fd4 tempest-ServerActionsTestOtherB-66905256 tempest-ServerActionsTestOtherB-66905256-project-member] Lock "9fc9f6b0-928e-46b4-ad7c-9217b2f31575" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.167s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2209.843521] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "9fc9f6b0-928e-46b4-ad7c-9217b2f31575" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 405.471s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2209.843712] env[67893]: INFO nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 9fc9f6b0-928e-46b4-ad7c-9217b2f31575] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 2209.843881] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "9fc9f6b0-928e-46b4-ad7c-9217b2f31575" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2217.171848] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2217.859217] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2217.859474] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2218.854974] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2218.858579] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2218.858764] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2218.858856] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2218.877796] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2218.877957] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2218.878101] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 11000d92-0094-4561-a807-ca76610ea549] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2218.878231] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Skipping network cache update for instance because it is Building. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2218.878356] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2218.878477] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2218.878597] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2218.878717] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2218.878835] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2221.859340] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2221.859637] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2222.858812] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2223.579942] env[67893]: DEBUG oslo_concurrency.lockutils [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "878bdb96-bd90-47ab-904b-ce1d184ecc72" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2223.580273] env[67893]: DEBUG oslo_concurrency.lockutils [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "878bdb96-bd90-47ab-904b-ce1d184ecc72" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2223.591934] env[67893]: DEBUG nova.compute.manager [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2223.639372] env[67893]: DEBUG oslo_concurrency.lockutils [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2223.639615] env[67893]: DEBUG oslo_concurrency.lockutils [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2223.641118] env[67893]: INFO nova.compute.claims [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2223.793034] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d57f89c4-0c1d-4c18-9c2e-f3319689174f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2223.800517] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8d9eda9-dede-4170-905a-995c2eb7d6ba {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2223.829883] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28230bb5-fba4-488f-b2c2-e57a5221517c {{(pid=67893) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2223.836706] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0e76355-eaf1-4a0a-abf1-9edec57ecf02 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2223.849573] env[67893]: DEBUG nova.compute.provider_tree [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2223.858291] env[67893]: DEBUG nova.scheduler.client.report [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2223.860996] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2223.872708] env[67893]: DEBUG oslo_concurrency.lockutils [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.232s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2223.872708] env[67893]: DEBUG nova.compute.manager [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2223.906042] env[67893]: DEBUG nova.compute.utils [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2223.906987] env[67893]: DEBUG nova.compute.manager [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Allocating IP information in the background. 
{{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2223.908146] env[67893]: DEBUG nova.network.neutron [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2223.917884] env[67893]: DEBUG nova.compute.manager [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2223.975849] env[67893]: DEBUG nova.policy [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '894285baafaf410ea301f676b78c45f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b439a6039a714a6fabd3c0477629d3c1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 2223.989844] env[67893]: DEBUG nova.compute.manager [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2224.015053] env[67893]: DEBUG nova.virt.hardware [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=<?>,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-15T20:39:38Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2224.015384] env[67893]: DEBUG nova.virt.hardware [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2224.015618] env[67893]: DEBUG nova.virt.hardware [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2224.015890] env[67893]: DEBUG nova.virt.hardware [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2224.016124] env[67893]: DEBUG nova.virt.hardware [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2224.016346] env[67893]: DEBUG nova.virt.hardware [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2224.016646] env[67893]: DEBUG nova.virt.hardware [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2224.016892] env[67893]: DEBUG nova.virt.hardware [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2224.017163] env[67893]: DEBUG nova.virt.hardware [None 
req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2224.017410] env[67893]: DEBUG nova.virt.hardware [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2224.017662] env[67893]: DEBUG nova.virt.hardware [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2224.018840] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e892a54d-edf6-4399-9c69-db30c49c7b27 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2224.029508] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c281aee0-a67d-4e87-9668-2010562259a5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2224.284835] env[67893]: DEBUG nova.network.neutron [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Successfully created port: 7a9a3009-0edb-45af-b747-fce70f3d4e6e {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2224.869415] env[67893]: DEBUG nova.compute.manager [req-ab3d8a5c-50b5-4fb6-a955-bde0f68e6147 req-604af053-5cd9-44c6-be7a-324e465bae10 service nova] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Received event network-vif-plugged-7a9a3009-0edb-45af-b747-fce70f3d4e6e {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2224.869662] env[67893]: DEBUG oslo_concurrency.lockutils [req-ab3d8a5c-50b5-4fb6-a955-bde0f68e6147 req-604af053-5cd9-44c6-be7a-324e465bae10 service nova] Acquiring lock "878bdb96-bd90-47ab-904b-ce1d184ecc72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2224.869842] env[67893]: DEBUG oslo_concurrency.lockutils [req-ab3d8a5c-50b5-4fb6-a955-bde0f68e6147 req-604af053-5cd9-44c6-be7a-324e465bae10 service nova] Lock "878bdb96-bd90-47ab-904b-ce1d184ecc72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2224.870015] env[67893]: DEBUG oslo_concurrency.lockutils [req-ab3d8a5c-50b5-4fb6-a955-bde0f68e6147 req-604af053-5cd9-44c6-be7a-324e465bae10 service nova] Lock "878bdb96-bd90-47ab-904b-ce1d184ecc72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2224.870186] env[67893]: DEBUG nova.compute.manager 
[req-ab3d8a5c-50b5-4fb6-a955-bde0f68e6147 req-604af053-5cd9-44c6-be7a-324e465bae10 service nova] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] No waiting events found dispatching network-vif-plugged-7a9a3009-0edb-45af-b747-fce70f3d4e6e {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2224.870377] env[67893]: WARNING nova.compute.manager [req-ab3d8a5c-50b5-4fb6-a955-bde0f68e6147 req-604af053-5cd9-44c6-be7a-324e465bae10 service nova] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Received unexpected event network-vif-plugged-7a9a3009-0edb-45af-b747-fce70f3d4e6e for instance with vm_state building and task_state spawning. [ 2224.957618] env[67893]: DEBUG nova.network.neutron [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Successfully updated port: 7a9a3009-0edb-45af-b747-fce70f3d4e6e {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2224.968859] env[67893]: DEBUG oslo_concurrency.lockutils [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "refresh_cache-878bdb96-bd90-47ab-904b-ce1d184ecc72" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2224.969067] env[67893]: DEBUG oslo_concurrency.lockutils [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquired lock "refresh_cache-878bdb96-bd90-47ab-904b-ce1d184ecc72" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2224.969224] env[67893]: DEBUG nova.network.neutron [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2225.004838] env[67893]: DEBUG nova.network.neutron [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Instance cache missing network info. 
{{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2225.160410] env[67893]: DEBUG nova.network.neutron [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Updating instance_info_cache with network_info: [{"id": "7a9a3009-0edb-45af-b747-fce70f3d4e6e", "address": "fa:16:3e:3f:52:f2", "network": {"id": "3269c624-7a70-494c-85bc-8230ffbbab83", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-740576182-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b439a6039a714a6fabd3c0477629d3c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7a9a3009-0e", "ovs_interfaceid": "7a9a3009-0edb-45af-b747-fce70f3d4e6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2225.170886] env[67893]: DEBUG oslo_concurrency.lockutils [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Releasing lock "refresh_cache-878bdb96-bd90-47ab-904b-ce1d184ecc72" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2225.171168] env[67893]: DEBUG nova.compute.manager [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Instance network_info: |[{"id": "7a9a3009-0edb-45af-b747-fce70f3d4e6e", "address": "fa:16:3e:3f:52:f2", "network": {"id": "3269c624-7a70-494c-85bc-8230ffbbab83", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-740576182-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b439a6039a714a6fabd3c0477629d3c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7a9a3009-0e", "ovs_interfaceid": "7a9a3009-0edb-45af-b747-fce70f3d4e6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 2225.171536] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:3f:52:f2', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'fa01fe1a-83b6-4c10-af75-00ddb17f9bbf', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7a9a3009-0edb-45af-b747-fce70f3d4e6e', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2225.179216] env[67893]: DEBUG oslo.service.loopingcall [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2225.179643] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2225.179872] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-05c6d73f-d1fb-47cc-b353-06e0b8fe68cd {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2225.201188] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2225.201188] env[67893]: value = "task-3455510" [ 2225.201188] env[67893]: _type = "Task" [ 2225.201188] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2225.208855] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455510, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2225.712029] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455510, 'name': CreateVM_Task, 'duration_secs': 0.301362} completed successfully. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2225.712029] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2225.712226] env[67893]: DEBUG oslo_concurrency.lockutils [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2225.712346] env[67893]: DEBUG oslo_concurrency.lockutils [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2225.712740] env[67893]: DEBUG oslo_concurrency.lockutils [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2225.712976] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bd49c972-4559-4c9b-ba12-407f31d3ab6a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2225.717083] env[67893]: DEBUG oslo_vmware.api [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Waiting for the task: (returnval){ [ 2225.717083] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52f1cb70-7393-6e70-1294-d124d6e907f5" [ 2225.717083] env[67893]: _type = "Task" [ 2225.717083] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2225.724268] env[67893]: DEBUG oslo_vmware.api [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52f1cb70-7393-6e70-1294-d124d6e907f5, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2226.227903] env[67893]: DEBUG oslo_concurrency.lockutils [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2226.228240] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2226.228342] env[67893]: DEBUG oslo_concurrency.lockutils [None req-634aa28d-8abf-45e3-a0a7-4c76f37582a7 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2226.859638] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2226.872544] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2226.872544] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2226.872544] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2226.872544] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2226.873035] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c10420c-8299-41ca-b9f7-679fa27666f0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2226.881515] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd7cca8c-9ccc-4270-86b0-ffb233e2888f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
2226.896719] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3e1a5a6-c085-4369-a1a8-e108677f3f44 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2226.904498] env[67893]: DEBUG nova.compute.manager [req-49357c18-ea38-454c-8726-996a57389d98 req-194880e9-5f60-48bb-98bb-181256bda1a2 service nova] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Received event network-changed-7a9a3009-0edb-45af-b747-fce70f3d4e6e {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2226.904742] env[67893]: DEBUG nova.compute.manager [req-49357c18-ea38-454c-8726-996a57389d98 req-194880e9-5f60-48bb-98bb-181256bda1a2 service nova] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Refreshing instance network info cache due to event network-changed-7a9a3009-0edb-45af-b747-fce70f3d4e6e. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2226.904961] env[67893]: DEBUG oslo_concurrency.lockutils [req-49357c18-ea38-454c-8726-996a57389d98 req-194880e9-5f60-48bb-98bb-181256bda1a2 service nova] Acquiring lock "refresh_cache-878bdb96-bd90-47ab-904b-ce1d184ecc72" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2226.905122] env[67893]: DEBUG oslo_concurrency.lockutils [req-49357c18-ea38-454c-8726-996a57389d98 req-194880e9-5f60-48bb-98bb-181256bda1a2 service nova] Acquired lock "refresh_cache-878bdb96-bd90-47ab-904b-ce1d184ecc72" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2226.905284] env[67893]: DEBUG nova.network.neutron [req-49357c18-ea38-454c-8726-996a57389d98 req-194880e9-5f60-48bb-98bb-181256bda1a2 service nova] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Refreshing network info cache for port 7a9a3009-0edb-45af-b747-fce70f3d4e6e {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2226.907010] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e40fb2c8-5d5e-4f20-b606-b95294a2dccd {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2226.938296] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180988MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2226.938530] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2226.938819] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2227.008219] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 15893b5f-a02a-4ce7-80c9-eea0658f9ac7 actively managed 
on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2227.008388] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 94760898-4f3c-4f41-85be-366f4108d0ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2227.008518] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 11000d92-0094-4561-a807-ca76610ea549 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2227.008641] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 7169c720-f69e-40a3-95d2-473639884cd9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2227.008762] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 72410dc2-74d9-4d59-bdd1-ad45b01c482b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2227.008878] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a5151a22-4174-4f66-a83a-55a0dd01c407 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2227.008995] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a71c2ee1-0286-4098-afca-f7666469a95f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2227.009164] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance d0f623f5-88e9-4806-8f30-584d277ba5fe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2227.009283] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 878bdb96-bd90-47ab-904b-ce1d184ecc72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2227.009460] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2227.009595] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2227.116663] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c85ba01-efb4-4c19-aca7-ee1c662f4ae6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2227.122997] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7680c850-f1f3-4b56-86c7-defc3e936330 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2227.154546] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86665a84-bfe8-4ddd-a90e-007eb7e32cbe {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2227.162067] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d1c652b-f6a5-42da-8bb7-a00ac588c5ef {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2227.173987] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2227.175603] env[67893]: DEBUG nova.network.neutron [req-49357c18-ea38-454c-8726-996a57389d98 req-194880e9-5f60-48bb-98bb-181256bda1a2 service nova] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Updated VIF entry in instance network info cache for port 7a9a3009-0edb-45af-b747-fce70f3d4e6e. 
{{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2227.175956] env[67893]: DEBUG nova.network.neutron [req-49357c18-ea38-454c-8726-996a57389d98 req-194880e9-5f60-48bb-98bb-181256bda1a2 service nova] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Updating instance_info_cache with network_info: [{"id": "7a9a3009-0edb-45af-b747-fce70f3d4e6e", "address": "fa:16:3e:3f:52:f2", "network": {"id": "3269c624-7a70-494c-85bc-8230ffbbab83", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-740576182-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b439a6039a714a6fabd3c0477629d3c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fa01fe1a-83b6-4c10-af75-00ddb17f9bbf", "external-id": "nsx-vlan-transportzone-431", "segmentation_id": 431, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7a9a3009-0e", "ovs_interfaceid": "7a9a3009-0edb-45af-b747-fce70f3d4e6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2227.181795] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2227.184845] env[67893]: DEBUG oslo_concurrency.lockutils [req-49357c18-ea38-454c-8726-996a57389d98 req-194880e9-5f60-48bb-98bb-181256bda1a2 service nova] Releasing lock "refresh_cache-878bdb96-bd90-47ab-904b-ce1d184ecc72" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2227.199041] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2227.199245] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.260s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2255.138077] env[67893]: WARNING oslo_vmware.rw_handles [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: 
Remote end closed connection without response [ 2255.138077] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2255.138077] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2255.138077] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2255.138077] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2255.138077] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 2255.138077] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2255.138077] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2255.138077] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2255.138077] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2255.138077] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2255.138077] env[67893]: ERROR oslo_vmware.rw_handles [ 2255.138799] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/72095c95-f494-494b-81c0-70180d0fe220/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2255.140774] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2255.141028] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Copying Virtual Disk [datastore1] vmware_temp/72095c95-f494-494b-81c0-70180d0fe220/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/72095c95-f494-494b-81c0-70180d0fe220/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2255.141336] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-13e1a3ab-6ea2-4fbd-ba2c-75896b89ef80 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2255.150135] env[67893]: DEBUG oslo_vmware.api [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Waiting for the task: (returnval){ [ 2255.150135] env[67893]: value = "task-3455511" [ 2255.150135] env[67893]: _type = "Task" [ 2255.150135] env[67893]: } to complete. 
{{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2255.158065] env[67893]: DEBUG oslo_vmware.api [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Task: {'id': task-3455511, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2255.660475] env[67893]: DEBUG oslo_vmware.exceptions [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2255.660768] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2255.661360] env[67893]: ERROR nova.compute.manager [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2255.661360] env[67893]: Faults: ['InvalidArgument'] [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Traceback (most recent call last): [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] yield resources [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] self.driver.spawn(context, instance, image_meta, [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] self._fetch_image_if_missing(context, vi) [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] image_cache(vi, tmp_image_ds_loc) [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 
15893b5f-a02a-4ce7-80c9-eea0658f9ac7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] vm_util.copy_virtual_disk( [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] session._wait_for_task(vmdk_copy_task) [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] return self.wait_for_task(task_ref) [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] return evt.wait() [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] result = hub.switch() [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] return self.greenlet.switch() [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] self.f(*self.args, **self.kw) [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] raise exceptions.translate_fault(task_info.error) [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Faults: ['InvalidArgument'] [ 2255.661360] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] [ 2255.662234] env[67893]: INFO nova.compute.manager [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Terminating instance [ 2255.663234] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2255.663458] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2255.663713] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3f993f48-c340-4889-9e23-155eba7ac6bf {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2255.665852] env[67893]: DEBUG nova.compute.manager [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2255.666067] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2255.666797] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09fccc0e-83a0-47ac-9ac1-c8c3915b680c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2255.673923] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2255.674941] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6a669a6e-acc5-4820-b2bd-efce3024497b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2255.677271] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2255.677271] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2255.677271] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-24bdbb93-a546-4b30-964a-d0f2adbace6f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2255.682157] env[67893]: DEBUG oslo_vmware.api [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Waiting for the task: (returnval){ [ 2255.682157] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52fea88e-dde9-e31a-0296-9147a180eef9" [ 2255.682157] env[67893]: _type = "Task" [ 2255.682157] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2255.689144] env[67893]: DEBUG oslo_vmware.api [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52fea88e-dde9-e31a-0296-9147a180eef9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2255.751061] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2255.751301] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2255.751453] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Deleting the datastore file [datastore1] 15893b5f-a02a-4ce7-80c9-eea0658f9ac7 {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2255.751708] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b0e10756-ae8e-4c72-8e00-081968a12c25 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2255.765989] env[67893]: DEBUG oslo_vmware.api [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Waiting for the task: (returnval){ [ 2255.765989] env[67893]: value = "task-3455513" [ 2255.765989] env[67893]: _type = "Task" [ 2255.765989] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2255.773072] env[67893]: DEBUG oslo_vmware.api [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Task: {'id': task-3455513, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2256.193821] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2256.194176] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Creating directory with path [datastore1] vmware_temp/8d63cdc3-755d-44ae-a325-1154bea16ba2/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2256.194399] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4d9ef7fe-0142-4f96-a435-a1fa5f569225 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2256.207393] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Created directory with path [datastore1] vmware_temp/8d63cdc3-755d-44ae-a325-1154bea16ba2/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2256.207601] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Fetch image to [datastore1] vmware_temp/8d63cdc3-755d-44ae-a325-1154bea16ba2/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2256.207794] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/8d63cdc3-755d-44ae-a325-1154bea16ba2/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2256.208588] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba95a015-aec2-45ca-961b-37fbef88a8bf {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2256.215423] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60cf98be-f5d0-49f9-ad60-7adfc3974616 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2256.225283] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e509607-c9c1-4727-a069-4613080f169b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2256.258796] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c8b1597-1e26-4520-8147-8d0925433f54 {{(pid=67893) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2256.264678] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fa345acd-dd13-4d22-94cf-b09bf429ed83 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2256.274011] env[67893]: DEBUG oslo_vmware.api [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Task: {'id': task-3455513, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.06753} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2256.274308] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2256.274528] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2256.274743] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2256.274955] env[67893]: INFO nova.compute.manager [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 2256.277175] env[67893]: DEBUG nova.compute.claims [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2256.277371] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2256.277622] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2256.288992] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2256.339966] env[67893]: DEBUG oslo_vmware.rw_handles [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8d63cdc3-755d-44ae-a325-1154bea16ba2/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2256.400357] env[67893]: DEBUG oslo_vmware.rw_handles [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2256.400498] env[67893]: DEBUG oslo_vmware.rw_handles [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8d63cdc3-755d-44ae-a325-1154bea16ba2/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2256.482384] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f16ea6d0-98bc-4387-aff7-2d2a1b007f5f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2256.490571] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6428a1e-c5a5-4c1e-b118-a9c3f2dd88fc {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2256.521684] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7dcf949-691d-4f58-ac32-e6952981a01b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2256.528886] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2129a8c7-4d7a-4c07-97ef-42a8f93ff388 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2256.541797] env[67893]: DEBUG nova.compute.provider_tree [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2256.550246] env[67893]: DEBUG nova.scheduler.client.report [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2256.564140] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.286s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2256.564662] env[67893]: ERROR nova.compute.manager [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2256.564662] env[67893]: Faults: ['InvalidArgument'] [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Traceback (most recent call last): [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2256.564662] 
env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] self.driver.spawn(context, instance, image_meta, [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] self._fetch_image_if_missing(context, vi) [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] image_cache(vi, tmp_image_ds_loc) [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] vm_util.copy_virtual_disk( [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] session._wait_for_task(vmdk_copy_task) [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] return self.wait_for_task(task_ref) [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] return evt.wait() [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] result = hub.switch() [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] return self.greenlet.switch() [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] self.f(*self.args, **self.kw) [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] raise exceptions.translate_fault(task_info.error) [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Faults: ['InvalidArgument'] [ 2256.564662] env[67893]: ERROR nova.compute.manager [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] [ 2256.565474] env[67893]: DEBUG nova.compute.utils [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2256.566751] env[67893]: DEBUG nova.compute.manager [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Build of instance 15893b5f-a02a-4ce7-80c9-eea0658f9ac7 was re-scheduled: A specified parameter was not correct: fileType [ 2256.566751] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2256.567127] env[67893]: DEBUG nova.compute.manager [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2256.567304] env[67893]: DEBUG nova.compute.manager [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2256.567507] env[67893]: DEBUG nova.compute.manager [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2256.567671] env[67893]: DEBUG nova.network.neutron [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2257.007301] env[67893]: DEBUG nova.network.neutron [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2257.021476] env[67893]: INFO nova.compute.manager [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Took 0.45 seconds to deallocate network for instance. [ 2257.122956] env[67893]: INFO nova.scheduler.client.report [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Deleted allocations for instance 15893b5f-a02a-4ce7-80c9-eea0658f9ac7 [ 2257.142692] env[67893]: DEBUG oslo_concurrency.lockutils [None req-3dd0b901-6a22-47f0-95c0-3b62f609bc08 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Lock "15893b5f-a02a-4ce7-80c9-eea0658f9ac7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 610.260s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2257.142964] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "15893b5f-a02a-4ce7-80c9-eea0658f9ac7" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 452.770s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2257.143232] env[67893]: INFO nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] During sync_power_state the instance has a pending task (spawning). Skip.
[ 2257.143342] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "15893b5f-a02a-4ce7-80c9-eea0658f9ac7" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2257.143814] env[67893]: DEBUG oslo_concurrency.lockutils [None req-cdbedd38-46d7-4dba-8be4-354990c429f5 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Lock "15893b5f-a02a-4ce7-80c9-eea0658f9ac7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 413.422s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2257.144059] env[67893]: DEBUG oslo_concurrency.lockutils [None req-cdbedd38-46d7-4dba-8be4-354990c429f5 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Acquiring lock "15893b5f-a02a-4ce7-80c9-eea0658f9ac7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2257.144269] env[67893]: DEBUG oslo_concurrency.lockutils [None req-cdbedd38-46d7-4dba-8be4-354990c429f5 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Lock "15893b5f-a02a-4ce7-80c9-eea0658f9ac7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2257.144432] env[67893]: DEBUG oslo_concurrency.lockutils [None req-cdbedd38-46d7-4dba-8be4-354990c429f5 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Lock "15893b5f-a02a-4ce7-80c9-eea0658f9ac7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2257.146736] env[67893]: INFO nova.compute.manager [None req-cdbedd38-46d7-4dba-8be4-354990c429f5 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Terminating instance [ 2257.148465] env[67893]: DEBUG nova.compute.manager [None req-cdbedd38-46d7-4dba-8be4-354990c429f5 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Start destroying the instance on the hypervisor.
{{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2257.148660] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-cdbedd38-46d7-4dba-8be4-354990c429f5 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2257.148921] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-72cffe7d-0632-43d3-93ce-0b3af8192536 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2257.158839] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f16c767-55fa-4765-a581-3abae2b1b448 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2257.187027] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-cdbedd38-46d7-4dba-8be4-354990c429f5 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 15893b5f-a02a-4ce7-80c9-eea0658f9ac7 could not be found. [ 2257.187027] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-cdbedd38-46d7-4dba-8be4-354990c429f5 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2257.187027] env[67893]: INFO nova.compute.manager [None req-cdbedd38-46d7-4dba-8be4-354990c429f5 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2257.187027] env[67893]: DEBUG oslo.service.loopingcall [None req-cdbedd38-46d7-4dba-8be4-354990c429f5 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2257.187027] env[67893]: DEBUG nova.compute.manager [-] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2257.187244] env[67893]: DEBUG nova.network.neutron [-] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2257.213805] env[67893]: DEBUG nova.network.neutron [-] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2257.222560] env[67893]: INFO nova.compute.manager [-] [instance: 15893b5f-a02a-4ce7-80c9-eea0658f9ac7] Took 0.04 seconds to deallocate network for instance.
[ 2257.312872] env[67893]: DEBUG oslo_concurrency.lockutils [None req-cdbedd38-46d7-4dba-8be4-354990c429f5 tempest-AttachVolumeNegativeTest-2014474775 tempest-AttachVolumeNegativeTest-2014474775-project-member] Lock "15893b5f-a02a-4ce7-80c9-eea0658f9ac7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.169s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2274.629887] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ff79d8bf-c1b1-4927-a95a-b861689812be tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquiring lock "a71c2ee1-0286-4098-afca-f7666469a95f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2278.200235] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2278.200616] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2278.200661] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2279.860569] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2279.860959] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2279.860959] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2279.880635] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2279.880797] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 11000d92-0094-4561-a807-ca76610ea549] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2279.880932] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Skipping network cache update for instance because it is Building.
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2279.881073] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2279.881202] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2279.881321] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2279.881440] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2279.881592] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2279.881679] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2280.876018] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2281.859604] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2281.859604] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2284.859782] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2284.860103] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2285.855085] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2288.859277] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2288.870703] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2288.870918] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2288.871090] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2288.871244] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2288.872330] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d7c593b-dc10-4729-ac39-d69fe59a4dd7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2288.880862] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fde89b0d-b23f-4e0b-adf5-2c262f0c735f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2288.894024] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fbb8351-c14c-4084-a79f-51e4305370da {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2288.899773] env[67893]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1167987c-5b6b-42fa-8b3d-674529873c75 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2288.928675] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180994MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2288.928807] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2288.928990] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2288.996032] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 94760898-4f3c-4f41-85be-366f4108d0ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2288.996032] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 11000d92-0094-4561-a807-ca76610ea549 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2288.996032] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 7169c720-f69e-40a3-95d2-473639884cd9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2288.996032] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 72410dc2-74d9-4d59-bdd1-ad45b01c482b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2288.996032] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a5151a22-4174-4f66-a83a-55a0dd01c407 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2288.996248] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a71c2ee1-0286-4098-afca-f7666469a95f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2288.996248] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance d0f623f5-88e9-4806-8f30-584d277ba5fe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2288.996355] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 878bdb96-bd90-47ab-904b-ce1d184ecc72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2288.996461] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2288.996599] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2289.095698] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09ff7905-0da4-4e4d-9ee2-e4d71a16c1fc {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.104058] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e8518f7-9b21-4930-a93f-6447b3e73d35 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.132648] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed258eac-ddc3-484e-9cae-83efb34271a7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.139290] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2684907-1d72-496d-b41c-381b6fc9410c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.152991] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2289.160854] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 
17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2289.175643] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2289.175820] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.247s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2306.149916] env[67893]: WARNING oslo_vmware.rw_handles [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2306.149916] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2306.149916] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2306.149916] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2306.149916] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2306.149916] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 2306.149916] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2306.149916] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2306.149916] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2306.149916] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2306.149916] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2306.149916] env[67893]: ERROR oslo_vmware.rw_handles [ 2306.150617] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/8d63cdc3-755d-44ae-a325-1154bea16ba2/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2306.152396] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 
2306.152633] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Copying Virtual Disk [datastore1] vmware_temp/8d63cdc3-755d-44ae-a325-1154bea16ba2/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/8d63cdc3-755d-44ae-a325-1154bea16ba2/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2306.153361] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-35881308-5be0-4eab-a3af-a6784ac790b6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2306.161200] env[67893]: DEBUG oslo_vmware.api [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Waiting for the task: (returnval){ [ 2306.161200] env[67893]: value = "task-3455514" [ 2306.161200] env[67893]: _type = "Task" [ 2306.161200] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2306.168841] env[67893]: DEBUG oslo_vmware.api [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Task: {'id': task-3455514, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2306.671617] env[67893]: DEBUG oslo_vmware.exceptions [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Fault InvalidArgument not matched. 
{{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2306.671906] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2306.672468] env[67893]: ERROR nova.compute.manager [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2306.672468] env[67893]: Faults: ['InvalidArgument'] [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Traceback (most recent call last): [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] yield resources [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] self.driver.spawn(context, instance, image_meta, [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] self._fetch_image_if_missing(context, vi) [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] image_cache(vi, tmp_image_ds_loc) [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] vm_util.copy_virtual_disk( [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] session._wait_for_task(vmdk_copy_task) [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 
157, in _wait_for_task [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] return self.wait_for_task(task_ref) [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] return evt.wait() [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] result = hub.switch() [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] return self.greenlet.switch() [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] self.f(*self.args, **self.kw) [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] raise exceptions.translate_fault(task_info.error) [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Faults: ['InvalidArgument'] [ 2306.672468] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] [ 2306.673424] env[67893]: INFO nova.compute.manager [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Terminating instance [ 2306.674347] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2306.674547] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2306.674776] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-27e0de6c-6e53-4829-943a-ed8844a1de71 {{(pid=67893) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2306.676994] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Acquiring lock "refresh_cache-94760898-4f3c-4f41-85be-366f4108d0ba" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2306.677164] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Acquired lock "refresh_cache-94760898-4f3c-4f41-85be-366f4108d0ba" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2306.677326] env[67893]: DEBUG nova.network.neutron [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2306.683923] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2306.684109] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2306.685247] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e4e62667-c7a0-43be-b9ad-a77490547aff {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2306.692307] env[67893]: DEBUG oslo_vmware.api [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Waiting for the task: (returnval){ [ 2306.692307] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52822756-0bfd-e1b2-a5c8-95b3e66080c9" [ 2306.692307] env[67893]: _type = "Task" [ 2306.692307] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2306.700936] env[67893]: DEBUG oslo_vmware.api [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52822756-0bfd-e1b2-a5c8-95b3e66080c9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2306.742785] env[67893]: DEBUG nova.network.neutron [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Instance cache missing network info. 
{{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2306.805388] env[67893]: DEBUG nova.network.neutron [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2306.814361] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Releasing lock "refresh_cache-94760898-4f3c-4f41-85be-366f4108d0ba" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2306.814759] env[67893]: DEBUG nova.compute.manager [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2306.814956] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2306.816057] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f670edd-fa39-43ac-89d8-c43bed7b6851 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2306.826018] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2306.826018] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6b93e43e-5141-4ba3-a104-6123706483bc {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2306.859348] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2306.860130] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2306.860130] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Deleting the datastore file [datastore1] 94760898-4f3c-4f41-85be-366f4108d0ba {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2306.860130] 
env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-fe1152d2-0c08-429d-8861-3de8d5cbb716 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2306.866045] env[67893]: DEBUG oslo_vmware.api [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Waiting for the task: (returnval){ [ 2306.866045] env[67893]: value = "task-3455516" [ 2306.866045] env[67893]: _type = "Task" [ 2306.866045] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2306.873326] env[67893]: DEBUG oslo_vmware.api [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Task: {'id': task-3455516, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2307.202609] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2307.206829] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Creating directory with path [datastore1] vmware_temp/a3967cdb-39b1-48a0-80b5-7133eaae3454/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2307.206829] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-18834975-037b-4989-94e6-9250556897da {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2307.214430] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Created directory with path [datastore1] vmware_temp/a3967cdb-39b1-48a0-80b5-7133eaae3454/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2307.214618] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Fetch image to [datastore1] vmware_temp/a3967cdb-39b1-48a0-80b5-7133eaae3454/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2307.214788] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/a3967cdb-39b1-48a0-80b5-7133eaae3454/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2307.215491] env[67893]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8dcb1dae-342e-4c9e-bc10-72bac5de869c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2307.221725] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8e60841-0fd7-48bd-936f-613f55ce7218 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2307.230295] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-768e421f-5fef-4108-b949-3ff238ff90c6 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2307.261387] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-227cbaeb-1534-4010-a891-c5bc1b8829bb {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2307.266525] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5b17f39d-4179-46e6-a4ac-2f9cb85f82fe {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2307.286086] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2307.333527] env[67893]: DEBUG oslo_vmware.rw_handles [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a3967cdb-39b1-48a0-80b5-7133eaae3454/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2307.392527] env[67893]: DEBUG oslo_vmware.rw_handles [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2307.392731] env[67893]: DEBUG oslo_vmware.rw_handles [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a3967cdb-39b1-48a0-80b5-7133eaae3454/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2307.396703] env[67893]: DEBUG oslo_vmware.api [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Task: {'id': task-3455516, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.037216} completed successfully. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2307.396945] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2307.397149] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2307.397339] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2307.397526] env[67893]: INFO nova.compute.manager [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Took 0.58 seconds to destroy the instance on the hypervisor. [ 2307.397766] env[67893]: DEBUG oslo.service.loopingcall [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2307.397976] env[67893]: DEBUG nova.compute.manager [-] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Skipping network deallocation for instance since networking was not requested.
{{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 2307.400197] env[67893]: DEBUG nova.compute.claims [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2307.400382] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2307.400615] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2307.543792] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a25d7a3-c913-4530-bba4-b2e742417b5f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2307.551331] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ee8b744-538d-4b21-a9dd-1a80791fc787 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2307.581649] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40481288-53cb-4ada-948c-b3d3a272b06e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2307.588419] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f16137e7-ec74-43f0-a4d0-337a69fad950 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2307.600837] env[67893]: DEBUG nova.compute.provider_tree [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2307.609161] env[67893]: DEBUG nova.scheduler.client.report [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2307.622684] env[67893]: DEBUG oslo_concurrency.lockutils [None 
req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.222s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2307.622816] env[67893]: ERROR nova.compute.manager [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2307.622816] env[67893]: Faults: ['InvalidArgument'] [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Traceback (most recent call last): [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] self.driver.spawn(context, instance, image_meta, [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] self._fetch_image_if_missing(context, vi) [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] image_cache(vi, tmp_image_ds_loc) [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] vm_util.copy_virtual_disk( [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] session._wait_for_task(vmdk_copy_task) [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] return self.wait_for_task(task_ref) [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] return 
evt.wait() [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] result = hub.switch() [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] return self.greenlet.switch() [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] self.f(*self.args, **self.kw) [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] raise exceptions.translate_fault(task_info.error) [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Faults: ['InvalidArgument'] [ 2307.622816] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] [ 2307.623526] env[67893]: DEBUG nova.compute.utils [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2307.625145] env[67893]: DEBUG nova.compute.manager [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Build of instance 94760898-4f3c-4f41-85be-366f4108d0ba was re-scheduled: A specified parameter was not correct: fileType [ 2307.625145] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2307.625523] env[67893]: DEBUG nova.compute.manager [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2307.625773] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Acquiring lock "refresh_cache-94760898-4f3c-4f41-85be-366f4108d0ba" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2307.625908] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 
tempest-ServerShowV254Test-1525439687-project-member] Acquired lock "refresh_cache-94760898-4f3c-4f41-85be-366f4108d0ba" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2307.626089] env[67893]: DEBUG nova.network.neutron [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2307.648416] env[67893]: DEBUG nova.network.neutron [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2307.703694] env[67893]: DEBUG nova.network.neutron [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2307.712533] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Releasing lock "refresh_cache-94760898-4f3c-4f41-85be-366f4108d0ba" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2307.712746] env[67893]: DEBUG nova.compute.manager [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2307.712928] env[67893]: DEBUG nova.compute.manager [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Skipping network deallocation for instance since networking was not requested.
{{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 2307.795472] env[67893]: INFO nova.scheduler.client.report [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Deleted allocations for instance 94760898-4f3c-4f41-85be-366f4108d0ba [ 2307.816344] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b2982d1f-9eee-40ad-aad3-4336d727f1b1 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Lock "94760898-4f3c-4f41-85be-366f4108d0ba" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 625.386s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2307.816635] env[67893]: DEBUG oslo_concurrency.lockutils [None req-2c71fbb1-0b8b-4f5f-a6b8-56cc9b463295 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Lock "94760898-4f3c-4f41-85be-366f4108d0ba" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 429.387s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2307.816870] env[67893]: DEBUG oslo_concurrency.lockutils [None req-2c71fbb1-0b8b-4f5f-a6b8-56cc9b463295 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Acquiring lock "94760898-4f3c-4f41-85be-366f4108d0ba-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2307.817087] env[67893]: DEBUG oslo_concurrency.lockutils [None req-2c71fbb1-0b8b-4f5f-a6b8-56cc9b463295 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Lock "94760898-4f3c-4f41-85be-366f4108d0ba-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2307.817256] env[67893]: DEBUG oslo_concurrency.lockutils [None req-2c71fbb1-0b8b-4f5f-a6b8-56cc9b463295 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Lock "94760898-4f3c-4f41-85be-366f4108d0ba-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2307.819414] env[67893]: INFO nova.compute.manager [None req-2c71fbb1-0b8b-4f5f-a6b8-56cc9b463295 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Terminating instance [ 2307.821014] env[67893]: DEBUG oslo_concurrency.lockutils [None req-2c71fbb1-0b8b-4f5f-a6b8-56cc9b463295 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Acquiring lock "refresh_cache-94760898-4f3c-4f41-85be-366f4108d0ba" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2307.821184] env[67893]: DEBUG oslo_concurrency.lockutils [None req-2c71fbb1-0b8b-4f5f-a6b8-56cc9b463295 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Acquired lock
"refresh_cache-94760898-4f3c-4f41-85be-366f4108d0ba" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2307.821353] env[67893]: DEBUG nova.network.neutron [None req-2c71fbb1-0b8b-4f5f-a6b8-56cc9b463295 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2307.847947] env[67893]: DEBUG nova.network.neutron [None req-2c71fbb1-0b8b-4f5f-a6b8-56cc9b463295 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2307.940104] env[67893]: DEBUG nova.network.neutron [None req-2c71fbb1-0b8b-4f5f-a6b8-56cc9b463295 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2307.951326] env[67893]: DEBUG oslo_concurrency.lockutils [None req-2c71fbb1-0b8b-4f5f-a6b8-56cc9b463295 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Releasing lock "refresh_cache-94760898-4f3c-4f41-85be-366f4108d0ba" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2307.951754] env[67893]: DEBUG nova.compute.manager [None req-2c71fbb1-0b8b-4f5f-a6b8-56cc9b463295 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2307.951968] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-2c71fbb1-0b8b-4f5f-a6b8-56cc9b463295 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2307.952559] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-089143eb-a33e-4419-b381-cf7ee0546f1c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2307.964439] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82b3b1e5-2516-475f-91be-6ac09f50b5d2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2307.992861] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-2c71fbb1-0b8b-4f5f-a6b8-56cc9b463295 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 94760898-4f3c-4f41-85be-366f4108d0ba could not be found. 
[ 2307.993087] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-2c71fbb1-0b8b-4f5f-a6b8-56cc9b463295 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2307.993273] env[67893]: INFO nova.compute.manager [None req-2c71fbb1-0b8b-4f5f-a6b8-56cc9b463295 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2307.993516] env[67893]: DEBUG oslo.service.loopingcall [None req-2c71fbb1-0b8b-4f5f-a6b8-56cc9b463295 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2307.993728] env[67893]: DEBUG nova.compute.manager [-] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2307.993829] env[67893]: DEBUG nova.network.neutron [-] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2308.107436] env[67893]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67893) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 2308.107711] env[67893]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 2308.108257] env[67893]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-ff2e8586-626a-471c-bd2a-0887e969ee8f'] [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall self._deallocate_network( [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2308.108257] env[67893]: ERROR 
oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 2308.108257] env[67893]: ERROR oslo.service.loopingcall [ 2308.109651] env[67893]: ERROR nova.compute.manager [None req-2c71fbb1-0b8b-4f5f-a6b8-56cc9b463295 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 2308.139938] env[67893]: ERROR nova.compute.manager [None req-2c71fbb1-0b8b-4f5f-a6b8-56cc9b463295 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Traceback (most recent call last): [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] ret = obj(*args, **kwargs) [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] exception_handler_v20(status_code, error_body) [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] raise client_exc(message=error_message, [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Neutron server returns request_ids: ['req-ff2e8586-626a-471c-bd2a-0887e969ee8f'] [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] During handling of the above exception, another exception occurred: [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Traceback (most recent call last): [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] self._delete_instance(context, instance, bdms) [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] self._shutdown_instance(context, instance, bdms) [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] self._try_deallocate_network(context, instance, requested_networks) [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] with excutils.save_and_reraise_exception(): [ 2308.139938] env[67893]: ERROR 
nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] self.force_reraise() [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] raise self.value [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] _deallocate_network_with_retries() [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] return evt.wait() [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] result = hub.switch() [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] return self.greenlet.switch() [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] result = func(*self.args, **self.kw) [ 2308.139938] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] result = f(*args, **kwargs) [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] self._deallocate_network( [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] self.network_api.deallocate_for_instance( [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 
94760898-4f3c-4f41-85be-366f4108d0ba] data = neutron.list_ports(**search_opts) [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] ret = obj(*args, **kwargs) [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] return self.list('ports', self.ports_path, retrieve_all, [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] ret = obj(*args, **kwargs) [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] for r in self._pagination(collection, path, **params): [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] res = self.get(path, params=params) [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] ret = obj(*args, **kwargs) [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] return self.retry_request("GET", action, body=body, [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] ret = obj(*args, **kwargs) [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] return self.do_request(method, action, body=body, [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] ret = obj(*args, **kwargs) [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] self._handle_fault_response(status_code, replybody, resp) [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 2308.141096] env[67893]: ERROR nova.compute.manager [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] [ 2308.166211] env[67893]: DEBUG oslo_concurrency.lockutils [None req-2c71fbb1-0b8b-4f5f-a6b8-56cc9b463295 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Lock "94760898-4f3c-4f41-85be-366f4108d0ba" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.349s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2308.210022] env[67893]: INFO nova.compute.manager [None req-2c71fbb1-0b8b-4f5f-a6b8-56cc9b463295 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] [instance: 94760898-4f3c-4f41-85be-366f4108d0ba] Successfully reverted task state from None on failure for instance. [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server [None req-2c71fbb1-0b8b-4f5f-a6b8-56cc9b463295 tempest-ServerShowV254Test-1525439687 tempest-ServerShowV254Test-1525439687-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-ff2e8586-626a-471c-bd2a-0887e969ee8f'] [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server raise self.value [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server raise self.value [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server raise self.value [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 2308.213065] env[67893]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server raise self.value [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server raise self.value [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server return evt.wait() [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2308.214616] env[67893]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2308.214616] env[67893]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2308.216164] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 2308.216164] env[67893]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 2308.216164] env[67893]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 2308.216164] env[67893]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 2308.216164] env[67893]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
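A 401 at this point means Neutron's Keystone rejected the token Nova obtained with its [neutron] service credentials, so the usual first step is to exercise those credentials outside Nova. A hypothetical standalone check follows; every value is a placeholder for the deployment's own settings (taken from the [neutron] section of nova.conf), and the device_id is simply the instance from the traceback above:

from keystoneauth1.identity import v3
from keystoneauth1 import session
from neutronclient.v2_0 import client as neutron_client

# Placeholders: substitute the deployment's auth_url, user, password, project.
auth = v3.Password(
    auth_url='http://controller:5000/v3',
    username='nova',
    password='secret',
    project_name='service',
    user_domain_name='Default',
    project_domain_name='Default',
)
sess = session.Session(auth=auth)
neutron = neutron_client.Client(session=sess)

# The same request the compute manager issued; a 401 here reproduces the
# failure with the credentials alone, independent of Nova.
print(neutron.list_ports(device_id='94760898-4f3c-4f41-85be-366f4108d0ba'))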
[ 2308.216164] env[67893]: ERROR oslo_messaging.rpc.server [ 2338.177671] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2339.859604] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2339.859861] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2339.859895] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2339.879530] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 11000d92-0094-4561-a807-ca76610ea549] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2339.879695] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2339.879826] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2339.879954] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2339.880093] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2339.880232] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2339.880356] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2339.880476] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2339.880927] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2339.881116] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2342.875593] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2343.859377] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2343.859559] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2344.859761] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2345.858164] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2348.859846] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2348.871574] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2348.871802] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2348.871966] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2348.872138] env[67893]: DEBUG nova.compute.resource_tracker [None 
req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2348.873230] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69392b92-b497-4153-90db-2c35eb78b9c9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2348.881846] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb1ec34f-3aa9-4c02-9d24-000649248b8e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2348.895226] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28e6d1a2-098a-440c-90d6-3c000b846b39 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2348.901305] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e16edc33-167c-4e1d-acde-5a9d8a8cbfb7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2348.930408] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180954MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2348.930543] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2348.930733] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2348.991127] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 11000d92-0094-4561-a807-ca76610ea549 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2348.991302] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 7169c720-f69e-40a3-95d2-473639884cd9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2348.991435] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 72410dc2-74d9-4d59-bdd1-ad45b01c482b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2348.991562] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a5151a22-4174-4f66-a83a-55a0dd01c407 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2348.991683] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a71c2ee1-0286-4098-afca-f7666469a95f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2348.991804] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance d0f623f5-88e9-4806-8f30-584d277ba5fe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2348.991923] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 878bdb96-bd90-47ab-904b-ce1d184ecc72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2348.992119] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2348.992261] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2349.073639] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50d53ae5-b286-4826-8c20-bdcfbf79fc2c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2349.080897] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-248c55be-a9c4-4406-99a4-116a190a2ea7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2349.109404] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87a97d96-276f-47f2-b341-fed9894fcdd8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2349.115788] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa8fc7bb-b086-4e8e-94b0-b325b153e9a4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2349.128449] env[67893]: DEBUG nova.compute.provider_tree 
[None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2349.136270] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2349.148980] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2349.149138] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.218s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2352.782363] env[67893]: WARNING oslo_vmware.rw_handles [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2352.782363] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2352.782363] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2352.782363] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2352.782363] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2352.782363] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 2352.782363] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2352.782363] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2352.782363] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2352.782363] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2352.782363] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2352.782363] env[67893]: ERROR oslo_vmware.rw_handles [ 2352.782971] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/a3967cdb-39b1-48a0-80b5-7133eaae3454/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image 
/opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2352.785184] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2352.785454] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Copying Virtual Disk [datastore1] vmware_temp/a3967cdb-39b1-48a0-80b5-7133eaae3454/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/a3967cdb-39b1-48a0-80b5-7133eaae3454/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2352.785746] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-10c146c4-d12f-4144-bbd6-b73c83aa8a43 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2352.794456] env[67893]: DEBUG oslo_vmware.api [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Waiting for the task: (returnval){ [ 2352.794456] env[67893]: value = "task-3455517" [ 2352.794456] env[67893]: _type = "Task" [ 2352.794456] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2352.802051] env[67893]: DEBUG oslo_vmware.api [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Task: {'id': task-3455517, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2353.305655] env[67893]: DEBUG oslo_vmware.exceptions [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Fault InvalidArgument not matched. 
{{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2353.305946] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2353.306547] env[67893]: ERROR nova.compute.manager [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2353.306547] env[67893]: Faults: ['InvalidArgument'] [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] Traceback (most recent call last): [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] yield resources [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] self.driver.spawn(context, instance, image_meta, [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] self._fetch_image_if_missing(context, vi) [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] image_cache(vi, tmp_image_ds_loc) [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] vm_util.copy_virtual_disk( [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] session._wait_for_task(vmdk_copy_task) [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] return self.wait_for_task(task_ref) [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] return evt.wait() [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] result = hub.switch() [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] return self.greenlet.switch() [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] self.f(*self.args, **self.kw) [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] raise exceptions.translate_fault(task_info.error) [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] Faults: ['InvalidArgument'] [ 2353.306547] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] [ 2353.307576] env[67893]: INFO nova.compute.manager [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Terminating instance [ 2353.308406] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2353.308627] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2353.308873] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3da1e1c6-2751-4f6f-a1c2-142caa3e79f0 {{(pid=67893) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2353.310931] env[67893]: DEBUG nova.compute.manager [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2353.311132] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2353.311833] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-138804b6-7ab5-4700-b679-da43a4612542 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2353.318257] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2353.318463] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e7fe0d1e-2db5-455e-8b7b-0e71988453b8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2353.320576] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2353.320751] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2353.321899] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c6c280c3-ce99-47bb-9e9c-85c2e3907c91 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2353.327426] env[67893]: DEBUG oslo_vmware.api [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Waiting for the task: (returnval){ [ 2353.327426] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52d03fe5-006b-9bb7-753a-c0c07f10910b" [ 2353.327426] env[67893]: _type = "Task" [ 2353.327426] env[67893]: } to complete. 
{{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2353.341370] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2353.341613] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Creating directory with path [datastore1] vmware_temp/f60be0f8-ff2a-45fc-a23a-d816e9c3653b/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2353.341860] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-62fc0429-e190-461e-86b0-6de55477424f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2353.361454] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Created directory with path [datastore1] vmware_temp/f60be0f8-ff2a-45fc-a23a-d816e9c3653b/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2353.361662] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Fetch image to [datastore1] vmware_temp/f60be0f8-ff2a-45fc-a23a-d816e9c3653b/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2353.361837] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/f60be0f8-ff2a-45fc-a23a-d816e9c3653b/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2353.362624] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66adbc73-119b-47fe-8b65-047cb37abfdb {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2353.369183] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee7d10df-dd99-496b-b292-b8e99485c396 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2353.378554] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4b7586b-f48f-4689-b9bd-7d164d24ecea {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2353.410610] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-705053cc-c1e7-4374-9a37-6b4054e1fd29 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2353.413228] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2353.413421] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2353.413598] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Deleting the datastore file [datastore1] 11000d92-0094-4561-a807-ca76610ea549 {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2353.413833] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ec4e62f7-2ac2-4777-9734-1384e724836c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2353.419372] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-462a01b2-ec50-4e1b-a29b-fbd27f5ad519 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2353.421108] env[67893]: DEBUG oslo_vmware.api [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Waiting for the task: (returnval){ [ 2353.421108] env[67893]: value = "task-3455519" [ 2353.421108] env[67893]: _type = "Task" [ 2353.421108] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2353.429891] env[67893]: DEBUG oslo_vmware.api [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Task: {'id': task-3455519, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2353.440276] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2353.512963] env[67893]: DEBUG oslo_vmware.rw_handles [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f60be0f8-ff2a-45fc-a23a-d816e9c3653b/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2353.572752] env[67893]: DEBUG oslo_vmware.rw_handles [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2353.572974] env[67893]: DEBUG oslo_vmware.rw_handles [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f60be0f8-ff2a-45fc-a23a-d816e9c3653b/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2353.931657] env[67893]: DEBUG oslo_vmware.api [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Task: {'id': task-3455519, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071475} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2353.931978] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2353.932042] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2353.932205] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2353.932375] env[67893]: INFO nova.compute.manager [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Took 0.62 seconds to destroy the instance on the hypervisor. 
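The wait_for_task / _poll_task entries above and below show the task-handling pattern these operations share: the driver invokes an asynchronous vSphere task (CopyVirtualDisk_Task, DeleteDatastoreFile_Task, ...), then polls the task info on an interval until it reaches 'success' or 'error', logging "progress is N%" on each pass and raising a translated fault on error. A minimal sketch of that loop, with get_task_info as a hypothetical stand-in for oslo.vmware's task-info lookup (the real implementation runs inside a looping call and translates faults into typed exceptions such as VimFaultException):

import time

class TaskError(Exception):
    """Raised when a polled task finishes in the 'error' state."""

def wait_for_task(get_task_info, poll_interval=0.5):
    """Poll a vSphere-style task until it succeeds or fails.

    get_task_info: callable returning an object with .state, .progress
    and .error attributes, mirroring vim.TaskInfo.
    """
    while True:
        info = get_task_info()
        if info.state == 'success':
            return info
        if info.state == 'error':
            # oslo.vmware translates the fault before raising; here we
            # simply wrap whatever the task reported.
            raise TaskError(info.error)
        # 'queued' or 'running': report progress and keep polling, which
        # is what produces the repeated "progress is 0%" lines above.
        print(f"progress is {info.progress or 0}%")
        time.sleep(poll_interval)

Polling on a fixed interval (rather than blocking on the SOAP call) is what lets a single eventlet-based worker keep many tasks like task-3455519 in flight at once.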
[ 2353.934431] env[67893]: DEBUG nova.compute.claims [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2353.934618] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2353.934843] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2354.076415] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bf336f4-2df3-4711-9a69-ab2c1cd20cce {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2354.083384] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c16c7503-a362-4154-9aab-6cf67f05d4f4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2354.111639] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e5df2fc-6bcd-42f4-abce-194b2f565c3e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2354.118198] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c326916b-4665-4762-9cf0-2adc5ae0982a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2354.130859] env[67893]: DEBUG nova.compute.provider_tree [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2354.139118] env[67893]: DEBUG nova.scheduler.client.report [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2354.153544] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.219s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2354.154073] env[67893]: ERROR nova.compute.manager [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2354.154073] env[67893]: Faults: ['InvalidArgument'] [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] Traceback (most recent call last): [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] self.driver.spawn(context, instance, image_meta, [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] self._fetch_image_if_missing(context, vi) [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] image_cache(vi, tmp_image_ds_loc) [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] vm_util.copy_virtual_disk( [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] session._wait_for_task(vmdk_copy_task) [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] return self.wait_for_task(task_ref) [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] return evt.wait() [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] result = hub.switch() [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] return self.greenlet.switch() [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] self.f(*self.args, **self.kw) [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] raise exceptions.translate_fault(task_info.error) [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] Faults: ['InvalidArgument'] [ 2354.154073] env[67893]: ERROR nova.compute.manager [instance: 11000d92-0094-4561-a807-ca76610ea549] [ 2354.154954] env[67893]: DEBUG nova.compute.utils [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2354.156075] env[67893]: DEBUG nova.compute.manager [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Build of instance 11000d92-0094-4561-a807-ca76610ea549 was re-scheduled: A specified parameter was not correct: fileType [ 2354.156075] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2354.156480] env[67893]: DEBUG nova.compute.manager [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2354.156658] env[67893]: DEBUG nova.compute.manager [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2354.156838] env[67893]: DEBUG nova.compute.manager [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2354.156996] env[67893]: DEBUG nova.network.neutron [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2354.700029] env[67893]: DEBUG nova.network.neutron [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2354.711746] env[67893]: INFO nova.compute.manager [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Took 0.55 seconds to deallocate network for instance. [ 2354.911423] env[67893]: INFO nova.scheduler.client.report [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Deleted allocations for instance 11000d92-0094-4561-a807-ca76610ea549 [ 2354.960223] env[67893]: DEBUG oslo_concurrency.lockutils [None req-d4eedff6-1bae-4734-bd3f-5ce51f2cd474 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock "11000d92-0094-4561-a807-ca76610ea549" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 668.359s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2354.960589] env[67893]: DEBUG oslo_concurrency.lockutils [None req-6b0ef2be-21ec-4e27-881d-14b43e209680 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock "11000d92-0094-4561-a807-ca76610ea549" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 473.241s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2354.960721] env[67893]: DEBUG oslo_concurrency.lockutils [None req-6b0ef2be-21ec-4e27-881d-14b43e209680 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquiring lock "11000d92-0094-4561-a807-ca76610ea549-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2354.960929] env[67893]: DEBUG oslo_concurrency.lockutils [None req-6b0ef2be-21ec-4e27-881d-14b43e209680 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock "11000d92-0094-4561-a807-ca76610ea549-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 
2354.961109] env[67893]: DEBUG oslo_concurrency.lockutils [None req-6b0ef2be-21ec-4e27-881d-14b43e209680 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock "11000d92-0094-4561-a807-ca76610ea549-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2354.963368] env[67893]: INFO nova.compute.manager [None req-6b0ef2be-21ec-4e27-881d-14b43e209680 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Terminating instance [ 2354.965062] env[67893]: DEBUG nova.compute.manager [None req-6b0ef2be-21ec-4e27-881d-14b43e209680 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2354.965258] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-6b0ef2be-21ec-4e27-881d-14b43e209680 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2354.965724] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b288d829-1218-4b22-bebb-69f1b15f90ff {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2354.974784] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4236d4d-3f01-4890-b227-deafee1e950a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2355.002229] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-6b0ef2be-21ec-4e27-881d-14b43e209680 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 11000d92-0094-4561-a807-ca76610ea549 could not be found. [ 2355.002382] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-6b0ef2be-21ec-4e27-881d-14b43e209680 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2355.002557] env[67893]: INFO nova.compute.manager [None req-6b0ef2be-21ec-4e27-881d-14b43e209680 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 11000d92-0094-4561-a807-ca76610ea549] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2355.002786] env[67893]: DEBUG oslo.service.loopingcall [None req-6b0ef2be-21ec-4e27-881d-14b43e209680 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2355.003250] env[67893]: DEBUG nova.compute.manager [-] [instance: 11000d92-0094-4561-a807-ca76610ea549] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2355.003347] env[67893]: DEBUG nova.network.neutron [-] [instance: 11000d92-0094-4561-a807-ca76610ea549] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2355.026677] env[67893]: DEBUG nova.network.neutron [-] [instance: 11000d92-0094-4561-a807-ca76610ea549] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2355.040291] env[67893]: INFO nova.compute.manager [-] [instance: 11000d92-0094-4561-a807-ca76610ea549] Took 0.04 seconds to deallocate network for instance. [ 2355.401739] env[67893]: DEBUG oslo_concurrency.lockutils [None req-6b0ef2be-21ec-4e27-881d-14b43e209680 tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock "11000d92-0094-4561-a807-ca76610ea549" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.441s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2363.401713] env[67893]: DEBUG oslo_concurrency.lockutils [None req-0fceee95-871d-43b3-ac9c-e6c14d400df9 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "d0f623f5-88e9-4806-8f30-584d277ba5fe" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2390.786542] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2390.786997] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Getting list of instances from cluster (obj){ [ 2390.786997] env[67893]: value = "domain-c8" [ 2390.786997] env[67893]: _type = "ClusterComputeResource" [ 2390.786997] env[67893]: } {{(pid=67893) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 2390.788085] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53767c54-6d95-4ccc-a5ff-739272275e3f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2390.803128] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Got total of 6 instances {{(pid=67893) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 2398.892980] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2399.859221] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2400.860053] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2400.860053] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2400.860053] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2400.875336] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2400.875488] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2400.875612] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2400.875739] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2400.875862] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2400.875981] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2400.876118] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2401.185536] env[67893]: WARNING oslo_vmware.rw_handles [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2401.185536] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2401.185536] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2401.185536] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2401.185536] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2401.185536] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 2401.185536] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2401.185536] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2401.185536] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2401.185536] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2401.185536] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2401.185536] env[67893]: ERROR oslo_vmware.rw_handles [ 2401.185965] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/f60be0f8-ff2a-45fc-a23a-d816e9c3653b/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2401.188276] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2401.188519] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Copying Virtual Disk [datastore1] vmware_temp/f60be0f8-ff2a-45fc-a23a-d816e9c3653b/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/f60be0f8-ff2a-45fc-a23a-d816e9c3653b/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2401.188793] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c9251e63-f3eb-4b98-864e-8656b500d90b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2401.196784] env[67893]: DEBUG oslo_vmware.api [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 
tempest-ServerDiskConfigTestJSON-1171009476-project-member] Waiting for the task: (returnval){ [ 2401.196784] env[67893]: value = "task-3455520" [ 2401.196784] env[67893]: _type = "Task" [ 2401.196784] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2401.204945] env[67893]: DEBUG oslo_vmware.api [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Task: {'id': task-3455520, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2401.706902] env[67893]: DEBUG oslo_vmware.exceptions [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2401.707237] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2401.707782] env[67893]: ERROR nova.compute.manager [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2401.707782] env[67893]: Faults: ['InvalidArgument'] [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Traceback (most recent call last): [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] yield resources [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] self.driver.spawn(context, instance, image_meta, [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] self._fetch_image_if_missing(context, vi) [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in 
_fetch_image_if_missing [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] image_cache(vi, tmp_image_ds_loc) [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] vm_util.copy_virtual_disk( [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] session._wait_for_task(vmdk_copy_task) [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] return self.wait_for_task(task_ref) [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] return evt.wait() [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] result = hub.switch() [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] return self.greenlet.switch() [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] self.f(*self.args, **self.kw) [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] raise exceptions.translate_fault(task_info.error) [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Faults: ['InvalidArgument'] [ 2401.707782] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] [ 2401.708640] env[67893]: INFO nova.compute.manager [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Terminating instance [ 2401.709634] env[67893]: 
DEBUG oslo_concurrency.lockutils [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2401.709838] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2401.710089] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5d8de574-662a-4d8c-b815-d70702dfa29d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2401.712177] env[67893]: DEBUG nova.compute.manager [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2401.712368] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2401.713078] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2934cbd5-5fef-42b8-9212-2a5fc5db4c2a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2401.719677] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2401.719932] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0e465b7a-9b98-4e54-b10d-a6f982e5f400 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2401.722093] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2401.722269] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2401.723210] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-629d2896-59be-4ee7-bb58-67ffaa00d969 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2401.727894] env[67893]: DEBUG oslo_vmware.api [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Waiting for the task: (returnval){ [ 2401.727894] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52a66bfc-568c-bdea-b66f-04119495e4a7" [ 2401.727894] env[67893]: _type = "Task" [ 2401.727894] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2401.734894] env[67893]: DEBUG oslo_vmware.api [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52a66bfc-568c-bdea-b66f-04119495e4a7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2401.792138] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2401.792332] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2401.792485] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Deleting the datastore file [datastore1] 7169c720-f69e-40a3-95d2-473639884cd9 {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2401.792740] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a94207c6-afe6-4daa-84ef-eaf14487977c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2401.798694] env[67893]: DEBUG oslo_vmware.api [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Waiting for the task: (returnval){ [ 2401.798694] env[67893]: value = "task-3455522" [ 2401.798694] env[67893]: _type = "Task" [ 2401.798694] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2401.805975] env[67893]: DEBUG oslo_vmware.api [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Task: {'id': task-3455522, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2401.858527] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2402.237799] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2402.238097] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Creating directory with path [datastore1] vmware_temp/392bf0b6-1f81-414e-8767-0ee65d686e27/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2402.238355] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-19c54608-d6dc-4caf-a6b1-b859bf730006 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2402.248952] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Created directory with path [datastore1] vmware_temp/392bf0b6-1f81-414e-8767-0ee65d686e27/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2402.249149] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Fetch image to [datastore1] vmware_temp/392bf0b6-1f81-414e-8767-0ee65d686e27/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2402.249326] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/392bf0b6-1f81-414e-8767-0ee65d686e27/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2402.250045] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a571074f-9145-4ec6-a2a3-84d97e486400 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2402.256777] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-419a3c7a-a474-4429-95d9-c4eda8e9b116 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2402.265336] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-f4da6345-8c3d-4955-87fb-4c156924023d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2402.296599] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff33093d-b5e6-4028-a97e-fcfe406e508c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2402.303521] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6a79e0ff-09a3-4819-a438-4e4f91b0ff71 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2402.307632] env[67893]: DEBUG oslo_vmware.api [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Task: {'id': task-3455522, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069668} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2402.308179] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2402.308364] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2402.308530] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2402.308700] env[67893]: INFO nova.compute.manager [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Took 0.60 seconds to destroy the instance on the hypervisor. 
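Each "Downloading image file data ... to the data store" step above is, at the HTTP level, a PUT of the Glance image stream against the ESX host's /folder endpoint: the URL shape (/folder/<path>?dcPath=ha-datacenter&dsName=datastore1) appears verbatim in the rw_handles entries, and the request is authorized by the generic service ticket acquired via SessionManager.AcquireGenericServiceTicket just beforehand. A rough sketch of that transfer using requests; the vmware_cgi_ticket cookie name is an assumption about oslo.vmware's ticket handling, not something shown in the log:

import requests

def upload_to_datastore(esx_host, ds_path, datastore, image_file, ticket):
    """Write image data to a datastore file over HTTPS.

    image_file: a file-like object yielding the image bytes; requests
    derives Content-Length from it. The ticket cookie name is assumed
    from oslo.vmware's rw_handles, which authorizes the transfer with a
    generic service ticket.
    """
    url = f"https://{esx_host}:443/folder/{ds_path}"
    resp = requests.put(
        url,
        params={"dcPath": "ha-datacenter", "dsName": datastore},
        data=image_file,
        headers={
            "Cookie": f"vmware_cgi_ticket={ticket}",
            "Content-Type": "application/octet-stream",
        },
        verify=False,  # lab ESX hosts typically present self-signed certs
    )
    resp.raise_for_status()
    return resp

Note that the server may close the connection without a response once the full payload is written; the WARNING with http.client.RemoteDisconnected earlier in this log shows oslo.vmware tolerating exactly that when closing its write handle, after which the download is still treated as complete.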
[ 2402.310717] env[67893]: DEBUG nova.compute.claims [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2402.310882] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2402.311116] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2402.329533] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2402.384217] env[67893]: DEBUG oslo_vmware.rw_handles [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/392bf0b6-1f81-414e-8767-0ee65d686e27/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2402.444750] env[67893]: DEBUG oslo_vmware.rw_handles [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2402.444940] env[67893]: DEBUG oslo_vmware.rw_handles [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/392bf0b6-1f81-414e-8767-0ee65d686e27/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2402.484285] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6074cb4-dd84-4a1c-ab15-bfa0228a2274 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2402.491818] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f401e20-cc93-47b1-a3e0-ed61e3798ee0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2402.520277] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02ccd210-a9cc-4fbe-8325-5a83be975560 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2402.526835] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02b5dcc2-2f02-44c2-9f0c-b5774d94dace {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2402.540993] env[67893]: DEBUG nova.compute.provider_tree [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2402.549738] env[67893]: DEBUG nova.scheduler.client.report [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2402.564584] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.253s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2402.565130] env[67893]: ERROR nova.compute.manager [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2402.565130] env[67893]: Faults: ['InvalidArgument'] [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Traceback (most recent call last): [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2402.565130] 
env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] self.driver.spawn(context, instance, image_meta, [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] self._fetch_image_if_missing(context, vi) [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] image_cache(vi, tmp_image_ds_loc) [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] vm_util.copy_virtual_disk( [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] session._wait_for_task(vmdk_copy_task) [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] return self.wait_for_task(task_ref) [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] return evt.wait() [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] result = hub.switch() [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] return self.greenlet.switch() [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] self.f(*self.args, **self.kw) [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] raise exceptions.translate_fault(task_info.error) [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Faults: ['InvalidArgument'] [ 2402.565130] env[67893]: ERROR nova.compute.manager [instance: 7169c720-f69e-40a3-95d2-473639884cd9] [ 2402.566048] env[67893]: DEBUG nova.compute.utils [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2402.567240] env[67893]: DEBUG nova.compute.manager [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Build of instance 7169c720-f69e-40a3-95d2-473639884cd9 was re-scheduled: A specified parameter was not correct: fileType [ 2402.567240] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2402.567607] env[67893]: DEBUG nova.compute.manager [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2402.567771] env[67893]: DEBUG nova.compute.manager [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2402.567936] env[67893]: DEBUG nova.compute.manager [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2402.568110] env[67893]: DEBUG nova.network.neutron [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2402.884316] env[67893]: DEBUG nova.network.neutron [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2402.893869] env[67893]: INFO nova.compute.manager [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Took 0.33 seconds to deallocate network for instance. [ 2402.998598] env[67893]: INFO nova.scheduler.client.report [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Deleted allocations for instance 7169c720-f69e-40a3-95d2-473639884cd9 [ 2403.023298] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7457fa98-969f-404a-9218-ed2d8a576c09 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "7169c720-f69e-40a3-95d2-473639884cd9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 627.418s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2403.023649] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dee9e879-9e3e-4c1a-818e-f071a6166f9d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "7169c720-f69e-40a3-95d2-473639884cd9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 431.883s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2403.023859] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dee9e879-9e3e-4c1a-818e-f071a6166f9d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "7169c720-f69e-40a3-95d2-473639884cd9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2403.024074] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dee9e879-9e3e-4c1a-818e-f071a6166f9d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "7169c720-f69e-40a3-95d2-473639884cd9-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2403.024245] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dee9e879-9e3e-4c1a-818e-f071a6166f9d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "7169c720-f69e-40a3-95d2-473639884cd9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2403.026716] env[67893]: INFO nova.compute.manager [None req-dee9e879-9e3e-4c1a-818e-f071a6166f9d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Terminating instance [ 2403.028851] env[67893]: DEBUG nova.compute.manager [None req-dee9e879-9e3e-4c1a-818e-f071a6166f9d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2403.029080] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-dee9e879-9e3e-4c1a-818e-f071a6166f9d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2403.029322] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-955e233b-bab8-474f-8cab-cf25e87e4561 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2403.039379] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5739a991-4411-4efe-af23-0f995fa21fbb {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2403.064388] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-dee9e879-9e3e-4c1a-818e-f071a6166f9d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7169c720-f69e-40a3-95d2-473639884cd9 could not be found. [ 2403.064594] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-dee9e879-9e3e-4c1a-818e-f071a6166f9d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2403.064771] env[67893]: INFO nova.compute.manager [None req-dee9e879-9e3e-4c1a-818e-f071a6166f9d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2403.065153] env[67893]: DEBUG oslo.service.loopingcall [None req-dee9e879-9e3e-4c1a-818e-f071a6166f9d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2403.065611] env[67893]: DEBUG nova.compute.manager [-] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2403.065714] env[67893]: DEBUG nova.network.neutron [-] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2403.087913] env[67893]: DEBUG nova.network.neutron [-] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2403.095429] env[67893]: INFO nova.compute.manager [-] [instance: 7169c720-f69e-40a3-95d2-473639884cd9] Took 0.03 seconds to deallocate network for instance. [ 2403.177051] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dee9e879-9e3e-4c1a-818e-f071a6166f9d tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "7169c720-f69e-40a3-95d2-473639884cd9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.153s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2403.853703] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2404.859376] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2405.859620] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2405.859959] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2405.860016] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2405.860151] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Cleaning up deleted instances {{(pid=67893) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 2405.868795] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] There are 0 instances to clean {{(pid=67893) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 2407.868626] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2408.859837] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2408.860041] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Cleaning up deleted instances with incomplete migration {{(pid=67893) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 2408.867656] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2410.868682] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2410.883731] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2410.895282] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2410.895513] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2410.895677] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) 
inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2410.895839] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2410.896905] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8420df5-77e9-4e00-88e2-0ac662e4f006 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2410.905569] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-353bda56-49ee-4d25-b839-ca93753cf4ef {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2410.919321] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a1d946f-dc58-486f-b602-a61a33c62ae1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2410.925650] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4605efc-7661-4182-8875-12abe35bfb23 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2410.953707] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180999MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2410.953894] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2410.954105] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2411.043380] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 72410dc2-74d9-4d59-bdd1-ad45b01c482b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2411.043548] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a5151a22-4174-4f66-a83a-55a0dd01c407 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2411.043672] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a71c2ee1-0286-4098-afca-f7666469a95f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2411.043791] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance d0f623f5-88e9-4806-8f30-584d277ba5fe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2411.043909] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 878bdb96-bd90-47ab-904b-ce1d184ecc72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2411.044106] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 5 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2411.044249] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1152MB phys_disk=200GB used_disk=5GB total_vcpus=48 used_vcpus=5 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2411.059152] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Refreshing inventories for resource provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 2411.071857] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Updating ProviderTree inventory for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 2411.072042] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Updating inventory in ProviderTree for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2411.081674] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Refreshing aggregate associations for resource provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57, aggregates: None {{(pid=67893) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 2411.097711] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Refreshing trait associations for resource provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO {{(pid=67893) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 2411.161607] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3aa5170d-ef14-429c-8213-81dae041b632 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2411.170183] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a166f4a7-a291-4f21-9533-38df6e5afe1e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2411.198384] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b8337b8-065c-4327-853b-1e7276798230 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2411.205295] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bfb8025-664d-444a-b903-10d4f2b341cc {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2411.217862] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2411.226835] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2411.242553] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2411.242743] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.289s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 
2415.124972] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._sync_power_states {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2415.139507] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Getting list of instances from cluster (obj){ [ 2415.139507] env[67893]: value = "domain-c8" [ 2415.139507] env[67893]: _type = "ClusterComputeResource" [ 2415.139507] env[67893]: } {{(pid=67893) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 2415.141790] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa7d55be-44d1-465f-8e00-572dd88ab546 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2415.155679] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Got total of 5 instances {{(pid=67893) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 2415.155846] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Triggering sync for uuid 72410dc2-74d9-4d59-bdd1-ad45b01c482b {{(pid=67893) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2415.156043] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Triggering sync for uuid a5151a22-4174-4f66-a83a-55a0dd01c407 {{(pid=67893) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2415.156208] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Triggering sync for uuid a71c2ee1-0286-4098-afca-f7666469a95f {{(pid=67893) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2415.156372] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Triggering sync for uuid d0f623f5-88e9-4806-8f30-584d277ba5fe {{(pid=67893) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2415.156518] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Triggering sync for uuid 878bdb96-bd90-47ab-904b-ce1d184ecc72 {{(pid=67893) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2415.156809] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "72410dc2-74d9-4d59-bdd1-ad45b01c482b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2415.157047] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "a5151a22-4174-4f66-a83a-55a0dd01c407" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2415.157281] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "a71c2ee1-0286-4098-afca-f7666469a95f" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67893) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2415.157484] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "d0f623f5-88e9-4806-8f30-584d277ba5fe" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2415.157677] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "878bdb96-bd90-47ab-904b-ce1d184ecc72" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2419.605241] env[67893]: DEBUG oslo_concurrency.lockutils [None req-040e0d6b-509a-4fd8-a899-25fa99c30631 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "878bdb96-bd90-47ab-904b-ce1d184ecc72" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2447.810908] env[67893]: WARNING oslo_vmware.rw_handles [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2447.810908] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2447.810908] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2447.810908] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2447.810908] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2447.810908] env[67893]: ERROR oslo_vmware.rw_handles response.begin() [ 2447.810908] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2447.810908] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2447.810908] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2447.810908] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2447.810908] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2447.810908] env[67893]: ERROR oslo_vmware.rw_handles [ 2447.811575] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/392bf0b6-1f81-414e-8767-0ee65d686e27/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2447.813294] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 
72410dc2-74d9-4d59-bdd1-ad45b01c482b] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2447.813531] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Copying Virtual Disk [datastore1] vmware_temp/392bf0b6-1f81-414e-8767-0ee65d686e27/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/392bf0b6-1f81-414e-8767-0ee65d686e27/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2447.814406] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3f60d75f-170e-494c-b532-ba414891479d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2447.821707] env[67893]: DEBUG oslo_vmware.api [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Waiting for the task: (returnval){ [ 2447.821707] env[67893]: value = "task-3455523" [ 2447.821707] env[67893]: _type = "Task" [ 2447.821707] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2447.829244] env[67893]: DEBUG oslo_vmware.api [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Task: {'id': task-3455523, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2448.331980] env[67893]: DEBUG oslo_vmware.exceptions [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Fault InvalidArgument not matched. 
{{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2448.332305] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2448.332838] env[67893]: ERROR nova.compute.manager [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2448.332838] env[67893]: Faults: ['InvalidArgument'] [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Traceback (most recent call last): [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] yield resources [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] self.driver.spawn(context, instance, image_meta, [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] self._fetch_image_if_missing(context, vi) [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] image_cache(vi, tmp_image_ds_loc) [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] vm_util.copy_virtual_disk( [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] session._wait_for_task(vmdk_copy_task) [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] return self.wait_for_task(task_ref) [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] return evt.wait() [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] result = hub.switch() [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] return self.greenlet.switch() [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] self.f(*self.args, **self.kw) [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] raise exceptions.translate_fault(task_info.error) [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Faults: ['InvalidArgument'] [ 2448.332838] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] [ 2448.333863] env[67893]: INFO nova.compute.manager [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Terminating instance [ 2448.334693] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2448.334901] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2448.335159] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-618c8eb9-59b1-4a9a-a970-54b3feda1667 {{(pid=67893) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2448.337502] env[67893]: DEBUG nova.compute.manager [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2448.337721] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2448.338494] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be5a6a41-12b6-4aab-915b-c2468d84bdf2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2448.344897] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2448.345174] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4b9bd667-2eba-4401-b602-b2ea15bcec0a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2448.347475] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2448.347675] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2448.348642] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-acf549df-681e-4b62-841c-6cd699f2bf45 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2448.353772] env[67893]: DEBUG oslo_vmware.api [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Waiting for the task: (returnval){ [ 2448.353772] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]5265e666-eacc-1297-a94e-804f45263acf" [ 2448.353772] env[67893]: _type = "Task" [ 2448.353772] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2448.360452] env[67893]: DEBUG oslo_vmware.api [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]5265e666-eacc-1297-a94e-804f45263acf, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2448.418619] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2448.418851] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2448.419016] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Deleting the datastore file [datastore1] 72410dc2-74d9-4d59-bdd1-ad45b01c482b {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2448.419295] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-aaefa4af-0391-4bb7-909e-2283f9f63a47 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2448.425619] env[67893]: DEBUG oslo_vmware.api [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Waiting for the task: (returnval){ [ 2448.425619] env[67893]: value = "task-3455525" [ 2448.425619] env[67893]: _type = "Task" [ 2448.425619] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2448.433347] env[67893]: DEBUG oslo_vmware.api [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Task: {'id': task-3455525, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2448.864272] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2448.864549] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Creating directory with path [datastore1] vmware_temp/81b410a4-6ebb-4bc9-aef0-fb410c04260d/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2448.864747] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ae04eb38-ff2e-435c-8088-fb24607b661e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2448.875689] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Created directory with path [datastore1] vmware_temp/81b410a4-6ebb-4bc9-aef0-fb410c04260d/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2448.875883] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Fetch image to [datastore1] vmware_temp/81b410a4-6ebb-4bc9-aef0-fb410c04260d/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2448.876065] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/81b410a4-6ebb-4bc9-aef0-fb410c04260d/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2448.876746] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-299a335e-e7a8-4953-b2b3-11c846b595c8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2448.883047] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6ff8f28-2790-4b2c-9653-1ca1b66629b8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2448.891450] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd44e754-66bb-4c02-b808-6841fe7f676e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2448.921714] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-e83fc812-47fa-4951-9d65-ea2cbdcc6bb4 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2448.929454] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7ff8caac-1ae8-442a-a331-d62af77fad5c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2448.935111] env[67893]: DEBUG oslo_vmware.api [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Task: {'id': task-3455525, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077023} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2448.935334] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2448.935510] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2448.935738] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2448.935932] env[67893]: INFO nova.compute.manager [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Took 0.60 seconds to destroy the instance on the hypervisor. 
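[editor's note] The `Task: {'id': task-3455525, ...} completed successfully` record above, and the earlier `progress is 0%` lines, come from oslo.vmware's task-polling loop: a vSphere `*_Task` method is invoked through the session and `wait_for_task()` polls the task moref until it succeeds or its error is translated and raised (the `VimFaultException` with `Faults: ['InvalidArgument']` that recurs in this log). A sketch of that pattern under stated assumptions; the host name and credentials are placeholders, not values from this deployment:

```python
# Sketch of the oslo.vmware task-polling pattern visible in the records
# above: invoke FileManager.DeleteDatastoreFile_Task, then poll the task.
# Host and credentials below are placeholders.
from oslo_vmware import api, exceptions

session = api.VMwareAPISession(
    'vc1.example.test', 'username', 'password',   # placeholders
    api_retry_count=10, task_poll_interval=0.5)

def delete_datastore_file(path, datacenter_moref):
    file_manager = session.vim.service_content.fileManager
    # Starts the task and returns its moref, matching the
    # "Invoking FileManager.DeleteDatastoreFile_Task" record.
    task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                              file_manager, name=path,
                              datacenter=datacenter_moref)
    try:
        # Polls until completion (the "progress is 0%" lines); on a task
        # error the fault is translated and raised.
        session.wait_for_task(task)
    except exceptions.VimFaultException as e:
        print(e.fault_list, e)  # e.g. ['InvalidArgument'] for a bad fileType
        raise
```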
[ 2448.937965] env[67893]: DEBUG nova.compute.claims [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2448.938144] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2448.938357] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2448.949806] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2448.999763] env[67893]: DEBUG oslo_vmware.rw_handles [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/81b410a4-6ebb-4bc9-aef0-fb410c04260d/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2449.060377] env[67893]: DEBUG oslo_vmware.rw_handles [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2449.060487] env[67893]: DEBUG oslo_vmware.rw_handles [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/81b410a4-6ebb-4bc9-aef0-fb410c04260d/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2449.099793] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-900ae6e7-57ff-48cd-9a9e-fcbd8f2336b9 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2449.106751] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57775f3c-bc21-46eb-8d77-57604ac1621b {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2449.135628] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a67d0698-c1d4-4d61-b947-d2aa0f1541e8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2449.142358] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f33964b2-bd97-4e89-a863-918a7fc3ba42 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2449.156053] env[67893]: DEBUG nova.compute.provider_tree [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2449.165094] env[67893]: DEBUG nova.scheduler.client.report [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2449.179637] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.241s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2449.180188] env[67893]: ERROR nova.compute.manager [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2449.180188] env[67893]: Faults: ['InvalidArgument'] [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Traceback (most recent call last): [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2449.180188] env[67893]: ERROR 
nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] self.driver.spawn(context, instance, image_meta, [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] self._fetch_image_if_missing(context, vi) [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] image_cache(vi, tmp_image_ds_loc) [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] vm_util.copy_virtual_disk( [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] session._wait_for_task(vmdk_copy_task) [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] return self.wait_for_task(task_ref) [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] return evt.wait() [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] result = hub.switch() [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] return self.greenlet.switch() [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] self.f(*self.args, **self.kw) [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] raise exceptions.translate_fault(task_info.error) [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Faults: ['InvalidArgument'] [ 2449.180188] env[67893]: ERROR nova.compute.manager [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] [ 2449.180884] env[67893]: DEBUG nova.compute.utils [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2449.182201] env[67893]: DEBUG nova.compute.manager [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Build of instance 72410dc2-74d9-4d59-bdd1-ad45b01c482b was re-scheduled: A specified parameter was not correct: fileType [ 2449.182201] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2449.182559] env[67893]: DEBUG nova.compute.manager [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2449.182729] env[67893]: DEBUG nova.compute.manager [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2449.182895] env[67893]: DEBUG nova.compute.manager [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2449.183065] env[67893]: DEBUG nova.network.neutron [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2449.503118] env[67893]: DEBUG nova.network.neutron [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2449.518499] env[67893]: INFO nova.compute.manager [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Took 0.34 seconds to deallocate network for instance. [ 2449.607089] env[67893]: INFO nova.scheduler.client.report [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Deleted allocations for instance 72410dc2-74d9-4d59-bdd1-ad45b01c482b [ 2449.630438] env[67893]: DEBUG oslo_concurrency.lockutils [None req-f0c1f650-0cbf-40da-a378-937dc6b11056 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "72410dc2-74d9-4d59-bdd1-ad45b01c482b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 618.264s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2449.630879] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dfdbc632-6bd6-4c5e-af06-f77846119324 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "72410dc2-74d9-4d59-bdd1-ad45b01c482b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 422.298s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2449.631153] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dfdbc632-6bd6-4c5e-af06-f77846119324 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Acquiring lock "72410dc2-74d9-4d59-bdd1-ad45b01c482b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2449.631415] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dfdbc632-6bd6-4c5e-af06-f77846119324 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "72410dc2-74d9-4d59-bdd1-ad45b01c482b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67893) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2449.631545] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dfdbc632-6bd6-4c5e-af06-f77846119324 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "72410dc2-74d9-4d59-bdd1-ad45b01c482b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2449.633499] env[67893]: INFO nova.compute.manager [None req-dfdbc632-6bd6-4c5e-af06-f77846119324 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Terminating instance [ 2449.637181] env[67893]: DEBUG nova.compute.manager [None req-dfdbc632-6bd6-4c5e-af06-f77846119324 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2449.637380] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-dfdbc632-6bd6-4c5e-af06-f77846119324 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2449.637670] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4c33a238-ee2e-4dcb-9297-3fca46b7d6bb {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2449.646691] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc7108a6-d3b4-4d5e-a406-0628fce6ee33 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2449.672212] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-dfdbc632-6bd6-4c5e-af06-f77846119324 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 72410dc2-74d9-4d59-bdd1-ad45b01c482b could not be found. [ 2449.672430] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-dfdbc632-6bd6-4c5e-af06-f77846119324 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2449.672609] env[67893]: INFO nova.compute.manager [None req-dfdbc632-6bd6-4c5e-af06-f77846119324 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2449.672852] env[67893]: DEBUG oslo.service.loopingcall [None req-dfdbc632-6bd6-4c5e-af06-f77846119324 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2449.673379] env[67893]: DEBUG nova.compute.manager [-] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2449.673480] env[67893]: DEBUG nova.network.neutron [-] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2449.695966] env[67893]: DEBUG nova.network.neutron [-] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2449.703966] env[67893]: INFO nova.compute.manager [-] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] Took 0.03 seconds to deallocate network for instance. [ 2449.784671] env[67893]: DEBUG oslo_concurrency.lockutils [None req-dfdbc632-6bd6-4c5e-af06-f77846119324 tempest-DeleteServersTestJSON-726721183 tempest-DeleteServersTestJSON-726721183-project-member] Lock "72410dc2-74d9-4d59-bdd1-ad45b01c482b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.154s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2449.785916] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "72410dc2-74d9-4d59-bdd1-ad45b01c482b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 34.629s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2449.786151] env[67893]: INFO nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 72410dc2-74d9-4d59-bdd1-ad45b01c482b] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 2449.786332] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "72410dc2-74d9-4d59-bdd1-ad45b01c482b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2458.891994] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2460.858606] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2462.859205] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2462.859434] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2462.859514] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2462.874871] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2462.875030] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2462.875166] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2462.875292] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2462.875442] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. 
{{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2462.875909] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2464.871300] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2465.858598] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2465.858858] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2465.859049] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2468.860157] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2471.402401] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquiring lock "0304da07-c90e-4a36-8f3d-f63d4997a4ba" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2471.402762] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock "0304da07-c90e-4a36-8f3d-f63d4997a4ba" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2471.413857] env[67893]: DEBUG nova.compute.manager [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 0304da07-c90e-4a36-8f3d-f63d4997a4ba] Starting instance... 
{{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2471.483184] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2471.483425] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2471.484843] env[67893]: INFO nova.compute.claims [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 0304da07-c90e-4a36-8f3d-f63d4997a4ba] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2471.598090] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-117cb9eb-27d9-46b1-9128-a804a9ae5269 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2471.605288] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c739907a-e13e-4e0d-8394-c1bc0b2b0558 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2471.633779] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e9d24a9-98d8-4abb-939d-2ffcd7c08d45 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2471.640758] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f053fe5-6c38-4f4a-bb70-5ac4f411f079 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2471.653576] env[67893]: DEBUG nova.compute.provider_tree [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2471.687159] env[67893]: DEBUG nova.scheduler.client.report [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2471.701962] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 
tempest-ServersTestJSON-1487459765-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.218s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2471.702589] env[67893]: DEBUG nova.compute.manager [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 0304da07-c90e-4a36-8f3d-f63d4997a4ba] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2471.736693] env[67893]: DEBUG nova.compute.utils [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2471.737973] env[67893]: DEBUG nova.compute.manager [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 0304da07-c90e-4a36-8f3d-f63d4997a4ba] Allocating IP information in the background. {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2471.738150] env[67893]: DEBUG nova.network.neutron [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 0304da07-c90e-4a36-8f3d-f63d4997a4ba] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2471.746506] env[67893]: DEBUG nova.compute.manager [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 0304da07-c90e-4a36-8f3d-f63d4997a4ba] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2471.806180] env[67893]: DEBUG nova.policy [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd1016091f2ab4fe69bcf52e8f536bc32', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '88e59a371b0d4dedb303e9b7f6d69b9d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 2471.811646] env[67893]: DEBUG nova.compute.manager [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 0304da07-c90e-4a36-8f3d-f63d4997a4ba] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2471.832565] env[67893]: DEBUG nova.virt.hardware [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2471.832821] env[67893]: DEBUG nova.virt.hardware [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2471.832980] env[67893]: DEBUG nova.virt.hardware [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2471.833176] env[67893]: DEBUG nova.virt.hardware [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2471.833320] env[67893]: DEBUG nova.virt.hardware [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2471.833463] env[67893]: DEBUG nova.virt.hardware [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2471.833663] env[67893]: DEBUG nova.virt.hardware [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2471.833820] env[67893]: DEBUG nova.virt.hardware [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2471.833986] env[67893]: DEBUG nova.virt.hardware [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 
tempest-ServersTestJSON-1487459765-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2471.834162] env[67893]: DEBUG nova.virt.hardware [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2471.834332] env[67893]: DEBUG nova.virt.hardware [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2471.835164] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a707e8e-7f0b-45ed-96d5-a4a8fe9fe730 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2471.842796] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-343ad3e3-57f0-450c-bee4-11246b52a48a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2472.121570] env[67893]: DEBUG nova.network.neutron [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 0304da07-c90e-4a36-8f3d-f63d4997a4ba] Successfully created port: 5ef31de6-25eb-455d-921f-b8e4abf6b4c8 {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2472.648333] env[67893]: DEBUG nova.compute.manager [req-8a2b4764-36cd-4f09-a7ea-b4ee43411915 req-ff47fbd6-5394-448a-a964-d259b6e462be service nova] [instance: 0304da07-c90e-4a36-8f3d-f63d4997a4ba] Received event network-vif-plugged-5ef31de6-25eb-455d-921f-b8e4abf6b4c8 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2472.648619] env[67893]: DEBUG oslo_concurrency.lockutils [req-8a2b4764-36cd-4f09-a7ea-b4ee43411915 req-ff47fbd6-5394-448a-a964-d259b6e462be service nova] Acquiring lock "0304da07-c90e-4a36-8f3d-f63d4997a4ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2472.648752] env[67893]: DEBUG oslo_concurrency.lockutils [req-8a2b4764-36cd-4f09-a7ea-b4ee43411915 req-ff47fbd6-5394-448a-a964-d259b6e462be service nova] Lock "0304da07-c90e-4a36-8f3d-f63d4997a4ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2472.648952] env[67893]: DEBUG oslo_concurrency.lockutils [req-8a2b4764-36cd-4f09-a7ea-b4ee43411915 req-ff47fbd6-5394-448a-a964-d259b6e462be service nova] Lock "0304da07-c90e-4a36-8f3d-f63d4997a4ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2472.649144] env[67893]: DEBUG nova.compute.manager [req-8a2b4764-36cd-4f09-a7ea-b4ee43411915 req-ff47fbd6-5394-448a-a964-d259b6e462be service nova] [instance: 0304da07-c90e-4a36-8f3d-f63d4997a4ba] 
No waiting events found dispatching network-vif-plugged-5ef31de6-25eb-455d-921f-b8e4abf6b4c8 {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2472.649306] env[67893]: WARNING nova.compute.manager [req-8a2b4764-36cd-4f09-a7ea-b4ee43411915 req-ff47fbd6-5394-448a-a964-d259b6e462be service nova] [instance: 0304da07-c90e-4a36-8f3d-f63d4997a4ba] Received unexpected event network-vif-plugged-5ef31de6-25eb-455d-921f-b8e4abf6b4c8 for instance with vm_state building and task_state spawning. [ 2472.722878] env[67893]: DEBUG nova.network.neutron [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 0304da07-c90e-4a36-8f3d-f63d4997a4ba] Successfully updated port: 5ef31de6-25eb-455d-921f-b8e4abf6b4c8 {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2472.734113] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquiring lock "refresh_cache-0304da07-c90e-4a36-8f3d-f63d4997a4ba" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2472.734269] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquired lock "refresh_cache-0304da07-c90e-4a36-8f3d-f63d4997a4ba" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2472.734419] env[67893]: DEBUG nova.network.neutron [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 0304da07-c90e-4a36-8f3d-f63d4997a4ba] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2472.770920] env[67893]: DEBUG nova.network.neutron [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 0304da07-c90e-4a36-8f3d-f63d4997a4ba] Instance cache missing network info. 
{{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2472.858497] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2472.869848] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2472.870098] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2472.870269] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2472.870419] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2472.871491] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ff03d43-13c4-4a5f-848a-2a67466481e0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2472.880619] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b93ec87-555d-469c-9e43-1525746440e5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2472.896142] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c51a65c-8c37-49f2-aafd-2b8d8704dbb2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2472.902527] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-372d5c9f-4395-49a5-88df-f0e14ee19ba3 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2472.932051] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180989MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2472.932176] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
2472.932373] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2472.937760] env[67893]: DEBUG nova.network.neutron [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 0304da07-c90e-4a36-8f3d-f63d4997a4ba] Updating instance_info_cache with network_info: [{"id": "5ef31de6-25eb-455d-921f-b8e4abf6b4c8", "address": "fa:16:3e:67:a9:63", "network": {"id": "f5f37611-ef93-4a5d-8b1c-169af83eb7a6", "bridge": "br-int", "label": "tempest-ServersTestJSON-1801901943-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "88e59a371b0d4dedb303e9b7f6d69b9d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c3e2368-4a35-4aa5-9135-23daedbbf9ef", "external-id": "nsx-vlan-transportzone-125", "segmentation_id": 125, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5ef31de6-25", "ovs_interfaceid": "5ef31de6-25eb-455d-921f-b8e4abf6b4c8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2472.954979] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Releasing lock "refresh_cache-0304da07-c90e-4a36-8f3d-f63d4997a4ba" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2472.955310] env[67893]: DEBUG nova.compute.manager [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 0304da07-c90e-4a36-8f3d-f63d4997a4ba] Instance network_info: |[{"id": "5ef31de6-25eb-455d-921f-b8e4abf6b4c8", "address": "fa:16:3e:67:a9:63", "network": {"id": "f5f37611-ef93-4a5d-8b1c-169af83eb7a6", "bridge": "br-int", "label": "tempest-ServersTestJSON-1801901943-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "88e59a371b0d4dedb303e9b7f6d69b9d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c3e2368-4a35-4aa5-9135-23daedbbf9ef", "external-id": "nsx-vlan-transportzone-125", "segmentation_id": 125, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5ef31de6-25", "ovs_interfaceid": "5ef31de6-25eb-455d-921f-b8e4abf6b4c8", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2472.955709] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 0304da07-c90e-4a36-8f3d-f63d4997a4ba] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:67:a9:63', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8c3e2368-4a35-4aa5-9135-23daedbbf9ef', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5ef31de6-25eb-455d-921f-b8e4abf6b4c8', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2472.965213] env[67893]: DEBUG oslo.service.loopingcall [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2472.965667] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0304da07-c90e-4a36-8f3d-f63d4997a4ba] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2472.966242] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-67842b02-5945-4e81-904c-4295329d628c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2472.989685] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2472.989685] env[67893]: value = "task-3455526" [ 2472.989685] env[67893]: _type = "Task" [ 2472.989685] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2473.001708] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455526, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2473.014414] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a5151a22-4174-4f66-a83a-55a0dd01c407 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2473.014569] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a71c2ee1-0286-4098-afca-f7666469a95f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2473.014699] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance d0f623f5-88e9-4806-8f30-584d277ba5fe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2473.014822] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 878bdb96-bd90-47ab-904b-ce1d184ecc72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2473.014942] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 0304da07-c90e-4a36-8f3d-f63d4997a4ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2473.015133] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 5 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2473.015274] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1152MB phys_disk=200GB used_disk=5GB total_vcpus=48 used_vcpus=5 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2473.092187] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6e755d5-365e-416f-87cc-c926dbe2581f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2473.102025] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7a7f712-9227-47bd-8d67-38ada7f5ad0a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2473.133118] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22732c28-dfe7-49ca-8016-469b9ee70303 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2473.140998] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc845f8d-9810-430b-8b8f-a267446fa5f2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2473.154996] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2473.164916] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2473.177987] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 2473.178202] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.246s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2473.501022] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455526, 'name': CreateVM_Task, 'duration_secs': 0.310529} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 2473.501022] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0304da07-c90e-4a36-8f3d-f63d4997a4ba] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 2473.501022] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2473.501022] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2473.501022] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 2473.501265] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f7d350ce-123b-45a3-a942-12e68c405b07 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2473.507093] env[67893]: DEBUG oslo_vmware.api [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Waiting for the task: (returnval){
[ 2473.507093] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52a3d2b5-74da-b8ea-ec75-febd0678bee4"
[ 2473.507093] env[67893]: _type = "Task"
[ 2473.507093] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2473.512715] env[67893]: DEBUG oslo_vmware.api [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52a3d2b5-74da-b8ea-ec75-febd0678bee4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2474.015879] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2474.016425] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: 0304da07-c90e-4a36-8f3d-f63d4997a4ba] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 2474.016577] env[67893]: DEBUG oslo_concurrency.lockutils [None req-b1754eb7-192f-4cf3-88ba-f665612e88fd tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2474.678551] env[67893]: DEBUG nova.compute.manager [req-05af9cf2-efe1-4ac1-8245-fee277dc4f45 req-90254199-9350-4181-810e-988af1f11773 service nova] [instance: 0304da07-c90e-4a36-8f3d-f63d4997a4ba] Received event network-changed-5ef31de6-25eb-455d-921f-b8e4abf6b4c8 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 2474.678727] env[67893]: DEBUG nova.compute.manager [req-05af9cf2-efe1-4ac1-8245-fee277dc4f45 req-90254199-9350-4181-810e-988af1f11773 service nova] [instance: 0304da07-c90e-4a36-8f3d-f63d4997a4ba] Refreshing instance network info cache due to event network-changed-5ef31de6-25eb-455d-921f-b8e4abf6b4c8. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 2474.678955] env[67893]: DEBUG oslo_concurrency.lockutils [req-05af9cf2-efe1-4ac1-8245-fee277dc4f45 req-90254199-9350-4181-810e-988af1f11773 service nova] Acquiring lock "refresh_cache-0304da07-c90e-4a36-8f3d-f63d4997a4ba" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2474.679235] env[67893]: DEBUG oslo_concurrency.lockutils [req-05af9cf2-efe1-4ac1-8245-fee277dc4f45 req-90254199-9350-4181-810e-988af1f11773 service nova] Acquired lock "refresh_cache-0304da07-c90e-4a36-8f3d-f63d4997a4ba" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2474.679417] env[67893]: DEBUG nova.network.neutron [req-05af9cf2-efe1-4ac1-8245-fee277dc4f45 req-90254199-9350-4181-810e-988af1f11773 service nova] [instance: 0304da07-c90e-4a36-8f3d-f63d4997a4ba] Refreshing network info cache for port 5ef31de6-25eb-455d-921f-b8e4abf6b4c8 {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 2475.066280] env[67893]: DEBUG nova.network.neutron [req-05af9cf2-efe1-4ac1-8245-fee277dc4f45 req-90254199-9350-4181-810e-988af1f11773 service nova] [instance: 0304da07-c90e-4a36-8f3d-f63d4997a4ba] Updated VIF entry in instance network info cache for port 5ef31de6-25eb-455d-921f-b8e4abf6b4c8. {{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 2475.066660] env[67893]: DEBUG nova.network.neutron [req-05af9cf2-efe1-4ac1-8245-fee277dc4f45 req-90254199-9350-4181-810e-988af1f11773 service nova] [instance: 0304da07-c90e-4a36-8f3d-f63d4997a4ba] Updating instance_info_cache with network_info: [{"id": "5ef31de6-25eb-455d-921f-b8e4abf6b4c8", "address": "fa:16:3e:67:a9:63", "network": {"id": "f5f37611-ef93-4a5d-8b1c-169af83eb7a6", "bridge": "br-int", "label": "tempest-ServersTestJSON-1801901943-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "88e59a371b0d4dedb303e9b7f6d69b9d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c3e2368-4a35-4aa5-9135-23daedbbf9ef", "external-id": "nsx-vlan-transportzone-125", "segmentation_id": 125, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5ef31de6-25", "ovs_interfaceid": "5ef31de6-25eb-455d-921f-b8e4abf6b4c8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2475.075957] env[67893]: DEBUG oslo_concurrency.lockutils [req-05af9cf2-efe1-4ac1-8245-fee277dc4f45 req-90254199-9350-4181-810e-988af1f11773 service nova] Releasing lock "refresh_cache-0304da07-c90e-4a36-8f3d-f63d4997a4ba" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2495.228213] env[67893]: WARNING oslo_vmware.rw_handles [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 2495.228213] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 2495.228213] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 2495.228213] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 2495.228213] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 2495.228213] env[67893]: ERROR oslo_vmware.rw_handles response.begin()
[ 2495.228213] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 2495.228213] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 2495.228213] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 2495.228213] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 2495.228213] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 2495.228213] env[67893]: ERROR oslo_vmware.rw_handles
[ 2495.229194] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/81b410a4-6ebb-4bc9-aef0-fb410c04260d/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 2495.230894] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 2495.231179] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Copying Virtual Disk [datastore1] vmware_temp/81b410a4-6ebb-4bc9-aef0-fb410c04260d/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/81b410a4-6ebb-4bc9-aef0-fb410c04260d/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 2495.231495] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-7e708a4c-0e7c-434c-9467-d6201d821437 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2495.239080] env[67893]: DEBUG oslo_vmware.api [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Waiting for the task: (returnval){
[ 2495.239080] env[67893]: value = "task-3455527"
[ 2495.239080] env[67893]: _type = "Task"
[ 2495.239080] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2495.247160] env[67893]: DEBUG oslo_vmware.api [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Task: {'id': task-3455527, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2495.750058] env[67893]: DEBUG oslo_vmware.exceptions [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 2495.750058] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2495.750545] env[67893]: ERROR nova.compute.manager [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2495.750545] env[67893]: Faults: ['InvalidArgument']
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Traceback (most recent call last):
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] yield resources
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] self.driver.spawn(context, instance, image_meta,
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] self._fetch_image_if_missing(context, vi)
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] image_cache(vi, tmp_image_ds_loc)
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] vm_util.copy_virtual_disk(
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] session._wait_for_task(vmdk_copy_task)
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] return self.wait_for_task(task_ref)
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] return evt.wait()
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] result = hub.switch()
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] return self.greenlet.switch()
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] self.f(*self.args, **self.kw)
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] raise exceptions.translate_fault(task_info.error)
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Faults: ['InvalidArgument']
[ 2495.750545] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407]
[ 2495.751834] env[67893]: INFO nova.compute.manager [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Terminating instance
[ 2495.752403] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2495.752612] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2495.752856] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-26fdd538-d7be-4cf5-855f-77ac25f63635 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2495.755209] env[67893]: DEBUG nova.compute.manager [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 2495.755466] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2495.756237] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4cd5849-1ce2-4c24-804a-59e1d657596e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2495.763204] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 2495.763453] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3362ddc3-9d03-41f6-9c74-222f629c81d0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2495.765795] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2495.765970] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 2495.766977] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-08990824-6130-4a6d-8368-2c3e8736dfaa {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2495.771806] env[67893]: DEBUG oslo_vmware.api [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Waiting for the task: (returnval){
[ 2495.771806] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52cd738b-60bb-a972-5f0f-da83e942afc8"
[ 2495.771806] env[67893]: _type = "Task"
[ 2495.771806] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2495.781094] env[67893]: DEBUG oslo_vmware.api [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52cd738b-60bb-a972-5f0f-da83e942afc8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2495.833393] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 2495.833654] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 2495.833863] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Deleting the datastore file [datastore1] a5151a22-4174-4f66-a83a-55a0dd01c407 {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 2495.834188] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-47b1ab90-eac2-4d30-a865-c4aad98fb729 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2495.840957] env[67893]: DEBUG oslo_vmware.api [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Waiting for the task: (returnval){
[ 2495.840957] env[67893]: value = "task-3455529"
[ 2495.840957] env[67893]: _type = "Task"
[ 2495.840957] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2495.851856] env[67893]: DEBUG oslo_vmware.api [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Task: {'id': task-3455529, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2496.281606] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 2496.282071] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Creating directory with path [datastore1] vmware_temp/c75578ed-8c5a-44f8-9a47-bab9a1b4a143/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2496.282071] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-abbde591-4a97-4313-bd30-a7e76eaf3e13 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2496.293194] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Created directory with path [datastore1] vmware_temp/c75578ed-8c5a-44f8-9a47-bab9a1b4a143/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2496.293379] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Fetch image to [datastore1] vmware_temp/c75578ed-8c5a-44f8-9a47-bab9a1b4a143/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 2496.293544] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/c75578ed-8c5a-44f8-9a47-bab9a1b4a143/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 2496.294237] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89f45aab-9ed5-46a0-9adc-a77fa71832f2 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2496.300322] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49568d25-245b-4302-b400-81522c27d7f5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2496.308878] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59e765f0-62ee-44ba-abc9-ec7e3f29e691 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2496.337942] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-431e3553-01b3-4589-b25e-74020391a98f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2496.345996] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3e1b274e-febd-4223-b38f-078b6853c2f8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2496.350264] env[67893]: DEBUG oslo_vmware.api [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Task: {'id': task-3455529, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065735} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 2496.350824] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 2496.351028] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 2496.351210] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 2496.351385] env[67893]: INFO nova.compute.manager [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Took 0.60 seconds to destroy the instance on the hypervisor.
[ 2496.353511] env[67893]: DEBUG nova.compute.claims [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 2496.353700] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2496.353909] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2496.368406] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 2496.421716] env[67893]: DEBUG oslo_vmware.rw_handles [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c75578ed-8c5a-44f8-9a47-bab9a1b4a143/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 2496.482797] env[67893]: DEBUG oslo_vmware.rw_handles [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 2496.482949] env[67893]: DEBUG oslo_vmware.rw_handles [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c75578ed-8c5a-44f8-9a47-bab9a1b4a143/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 2496.527881] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf599163-da14-469f-9023-9a3a45901202 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2496.534779] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc5d32e7-d359-46b6-8fcc-53c16ea734d7 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2496.565473] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc69e758-9d5e-4337-94ee-be3d8833711f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2496.572418] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14529e08-dc84-4765-ac48-d5f5436ad027 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2496.585360] env[67893]: DEBUG nova.compute.provider_tree [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2496.593859] env[67893]: DEBUG nova.scheduler.client.report [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2496.607147] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.253s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2496.607694] env[67893]: ERROR nova.compute.manager [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2496.607694] env[67893]: Faults: ['InvalidArgument']
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Traceback (most recent call last):
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] self.driver.spawn(context, instance, image_meta,
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] self._fetch_image_if_missing(context, vi)
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] image_cache(vi, tmp_image_ds_loc)
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] vm_util.copy_virtual_disk(
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] session._wait_for_task(vmdk_copy_task)
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] return self.wait_for_task(task_ref)
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] return evt.wait()
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] result = hub.switch()
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] return self.greenlet.switch()
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] self.f(*self.args, **self.kw)
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] raise exceptions.translate_fault(task_info.error)
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Faults: ['InvalidArgument']
[ 2496.607694] env[67893]: ERROR nova.compute.manager [instance: a5151a22-4174-4f66-a83a-55a0dd01c407]
[ 2496.608656] env[67893]: DEBUG nova.compute.utils [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 2496.609757] env[67893]: DEBUG nova.compute.manager [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Build of instance a5151a22-4174-4f66-a83a-55a0dd01c407 was re-scheduled: A specified parameter was not correct: fileType
[ 2496.609757] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 2496.610123] env[67893]: DEBUG nova.compute.manager [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 2496.610298] env[67893]: DEBUG nova.compute.manager [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 2496.610463] env[67893]: DEBUG nova.compute.manager [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 2496.610652] env[67893]: DEBUG nova.network.neutron [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2497.035166] env[67893]: DEBUG nova.network.neutron [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2497.047330] env[67893]: INFO nova.compute.manager [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Took 0.44 seconds to deallocate network for instance.
[ 2497.146756] env[67893]: INFO nova.scheduler.client.report [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Deleted allocations for instance a5151a22-4174-4f66-a83a-55a0dd01c407
[ 2497.169543] env[67893]: DEBUG oslo_concurrency.lockutils [None req-7d439233-f624-4c9b-8df7-86f0b61df719 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Lock "a5151a22-4174-4f66-a83a-55a0dd01c407" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 626.676s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2497.169825] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ed988f03-4bec-438c-8228-f9c2afbcfed6 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Lock "a5151a22-4174-4f66-a83a-55a0dd01c407" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 430.824s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2497.170058] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ed988f03-4bec-438c-8228-f9c2afbcfed6 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Acquiring lock "a5151a22-4174-4f66-a83a-55a0dd01c407-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2497.170269] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ed988f03-4bec-438c-8228-f9c2afbcfed6 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Lock "a5151a22-4174-4f66-a83a-55a0dd01c407-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2497.170439] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ed988f03-4bec-438c-8228-f9c2afbcfed6 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Lock "a5151a22-4174-4f66-a83a-55a0dd01c407-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2497.172352] env[67893]: INFO nova.compute.manager [None req-ed988f03-4bec-438c-8228-f9c2afbcfed6 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Terminating instance
[ 2497.174049] env[67893]: DEBUG nova.compute.manager [None req-ed988f03-4bec-438c-8228-f9c2afbcfed6 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 2497.174244] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ed988f03-4bec-438c-8228-f9c2afbcfed6 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2497.174721] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-95b513e1-8fdf-4f53-ba99-a9afe75b7706 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2497.183678] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d1e0b12-4998-46e6-a727-d642bcbabe05 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2497.211138] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-ed988f03-4bec-438c-8228-f9c2afbcfed6 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a5151a22-4174-4f66-a83a-55a0dd01c407 could not be found.
[ 2497.211138] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ed988f03-4bec-438c-8228-f9c2afbcfed6 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 2497.211284] env[67893]: INFO nova.compute.manager [None req-ed988f03-4bec-438c-8228-f9c2afbcfed6 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 2497.211426] env[67893]: DEBUG oslo.service.loopingcall [None req-ed988f03-4bec-438c-8228-f9c2afbcfed6 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 2497.211939] env[67893]: DEBUG nova.compute.manager [-] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 2497.212051] env[67893]: DEBUG nova.network.neutron [-] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2497.237138] env[67893]: DEBUG nova.network.neutron [-] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2497.245591] env[67893]: INFO nova.compute.manager [-] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] Took 0.03 seconds to deallocate network for instance.
[ 2497.330820] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ed988f03-4bec-438c-8228-f9c2afbcfed6 tempest-ImagesOneServerTestJSON-1610803408 tempest-ImagesOneServerTestJSON-1610803408-project-member] Lock "a5151a22-4174-4f66-a83a-55a0dd01c407" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.161s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2497.331847] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "a5151a22-4174-4f66-a83a-55a0dd01c407" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 82.174s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2497.331847] env[67893]: INFO nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a5151a22-4174-4f66-a83a-55a0dd01c407] During sync_power_state the instance has a pending task (deleting). Skip.
[ 2497.332083] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "a5151a22-4174-4f66-a83a-55a0dd01c407" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2521.178733] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2521.179226] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2523.860044] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2523.860435] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Starting heal instance info cache {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 2523.860435] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Rebuilding the list of instances to heal {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 2523.875719] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2523.875876] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2523.875987] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 878bdb96-bd90-47ab-904b-ce1d184ecc72] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2523.876134] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: 0304da07-c90e-4a36-8f3d-f63d4997a4ba] Skipping network cache update for instance because it is Building. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2523.876263] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Didn't find any instances for network info cache update. {{(pid=67893) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 2523.876752] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2525.858987] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2525.858987] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2525.860203] env[67893]: DEBUG nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67893) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 2526.859638] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2528.859025] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2533.856073] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2534.859288] env[67893]: DEBUG oslo_service.periodic_task [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Running periodic task ComputeManager.update_available_resource {{(pid=67893) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2534.871200] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2534.871432] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2534.871597] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2534.871752] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67893) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 2534.872874] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18df5974-04e5-424d-8796-acd00b2436a5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2534.881824] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3dcfba90-3e58-4bb4-a035-c786bb7785f0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2534.895870] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4d4a1b5-7e1d-491b-b4f4-fa32b94e43ee {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2534.901849] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01d70f52-96eb-482e-a708-d432cd0aea8f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2534.930921] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180991MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67893) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 2534.931076] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2534.931263] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2534.983027] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance a71c2ee1-0286-4098-afca-f7666469a95f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2534.983200] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance d0f623f5-88e9-4806-8f30-584d277ba5fe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2534.983332] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 878bdb96-bd90-47ab-904b-ce1d184ecc72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2534.983454] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Instance 0304da07-c90e-4a36-8f3d-f63d4997a4ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67893) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2534.983627] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Total usable vcpus: 48, total allocated vcpus: 4 {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 2534.983767] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1024MB phys_disk=200GB used_disk=4GB total_vcpus=48 used_vcpus=4 pci_stats=[] {{(pid=67893) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 2535.038925] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca6a954e-e0a6-4ed0-bccb-e1ad59de21f0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2535.045958] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa959510-e380-4f75-843b-ab677096e7c5 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2535.074302] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38cd8d6a-6d6e-430c-96c6-6d1dad277dec {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2535.081471] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3524efb1-7987-4bdd-a02e-72b10991630a {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2535.095306] env[67893]: DEBUG nova.compute.provider_tree [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2535.103652] env[67893]: DEBUG nova.scheduler.client.report [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2535.116240] env[67893]: DEBUG nova.compute.resource_tracker [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67893) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 2535.116414] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.185s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2544.126253] env[67893]: WARNING oslo_vmware.rw_handles [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 2544.126253] env[67893]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 2544.126253] env[67893]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 2544.126253] env[67893]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 2544.126253] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 2544.126253] env[67893]: ERROR oslo_vmware.rw_handles response.begin()
[ 2544.126253] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 2544.126253] env[67893]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 2544.126253] env[67893]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 2544.126253] env[67893]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 2544.126253] env[67893]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 2544.126253] env[67893]: ERROR oslo_vmware.rw_handles
[ 2544.127348] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Downloaded image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to vmware_temp/c75578ed-8c5a-44f8-9a47-bab9a1b4a143/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 2544.128925] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Caching image {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 2544.129229] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Copying Virtual Disk [datastore1] vmware_temp/c75578ed-8c5a-44f8-9a47-bab9a1b4a143/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk to [datastore1] vmware_temp/c75578ed-8c5a-44f8-9a47-bab9a1b4a143/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk {{(pid=67893) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 2544.129519] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e8ff0d1d-f951-4714-8e95-c99f0affd261 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2544.136871] env[67893]: DEBUG oslo_vmware.api [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Waiting for the task: (returnval){
[ 2544.136871] env[67893]: value = "task-3455530"
[ 2544.136871] env[67893]: _type = "Task"
[ 2544.136871] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2544.144452] env[67893]: DEBUG oslo_vmware.api [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Task: {'id': task-3455530, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2544.647591] env[67893]: DEBUG oslo_vmware.exceptions [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Fault InvalidArgument not matched. {{(pid=67893) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 2544.647879] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2544.648535] env[67893]: ERROR nova.compute.manager [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2544.648535] env[67893]: Faults: ['InvalidArgument']
[ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Traceback (most recent call last):
[ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] yield resources
[ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] self.driver.spawn(context, instance, image_meta,
[ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] self._fetch_image_if_missing(context, vi)
[ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] File
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] image_cache(vi, tmp_image_ds_loc) [ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] vm_util.copy_virtual_disk( [ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] session._wait_for_task(vmdk_copy_task) [ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] return self.wait_for_task(task_ref) [ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] return evt.wait() [ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] result = hub.switch() [ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] return self.greenlet.switch() [ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] self.f(*self.args, **self.kw) [ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] raise exceptions.translate_fault(task_info.error) [ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Faults: ['InvalidArgument'] [ 2544.648535] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] [ 2544.649411] env[67893]: INFO nova.compute.manager [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] 
Terminating instance [ 2544.651755] env[67893]: DEBUG nova.compute.manager [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2544.652118] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2544.652412] env[67893]: DEBUG oslo_concurrency.lockutils [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2544.652676] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2544.653768] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c49d8892-8da9-4945-9cad-366f2fc1b76f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2544.656723] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6a1da943-44a4-4459-a674-b6cb81ff2f28 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2544.662380] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Unregistering the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2544.662592] env[67893]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-594fccc1-0c53-4e85-ae57-2b5dace8525e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2544.664640] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2544.664814] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67893) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2544.665746] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6f0342d7-7d7e-4f09-ad54-10013cf65d45 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2544.670407] env[67893]: DEBUG oslo_vmware.api [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Waiting for the task: (returnval){ [ 2544.670407] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52501698-6f64-5292-4190-6cf1c9e42b0f" [ 2544.670407] env[67893]: _type = "Task" [ 2544.670407] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2544.677138] env[67893]: DEBUG oslo_vmware.api [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52501698-6f64-5292-4190-6cf1c9e42b0f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2544.731980] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Unregistered the VM {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2544.732290] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Deleting contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2544.732502] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Deleting the datastore file [datastore1] a71c2ee1-0286-4098-afca-f7666469a95f {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2544.732765] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-145fc407-c50a-46ee-8f4a-06254f936317 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2544.739149] env[67893]: DEBUG oslo_vmware.api [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Waiting for the task: (returnval){ [ 2544.739149] env[67893]: value = "task-3455532" [ 2544.739149] env[67893]: _type = "Task" [ 2544.739149] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2544.746853] env[67893]: DEBUG oslo_vmware.api [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Task: {'id': task-3455532, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2545.181212] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Preparing fetch location {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2545.181647] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Creating directory with path [datastore1] vmware_temp/5e669989-475f-48e6-88b8-f015cb776b00/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2545.181892] env[67893]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-11c98ec6-ab1f-4451-950a-ac189962d160 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2545.193208] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Created directory with path [datastore1] vmware_temp/5e669989-475f-48e6-88b8-f015cb776b00/c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2545.193458] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Fetch image to [datastore1] vmware_temp/5e669989-475f-48e6-88b8-f015cb776b00/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2545.193650] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to [datastore1] vmware_temp/5e669989-475f-48e6-88b8-f015cb776b00/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk on the data store datastore1 {{(pid=67893) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2545.194392] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d04de390-7e3a-4456-a6ed-7d9e672346b0 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2545.200940] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96245b24-a4d0-4c4e-b437-a6f8f9e0d478 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2545.209566] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7237659-cd3d-44c1-a807-97bd99635575 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2545.243418] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-fe02b8e9-c71f-4ea5-87f6-f0ab90b34e6e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2545.251505] env[67893]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-9c4f53a4-de94-49b5-976d-7e0835d20bac {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2545.253176] env[67893]: DEBUG oslo_vmware.api [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Task: {'id': task-3455532, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076218} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2545.253409] env[67893]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Deleted the datastore file {{(pid=67893) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2545.253585] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Deleted contents of the VM from datastore datastore1 {{(pid=67893) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2545.253847] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2545.254051] env[67893]: INFO nova.compute.manager [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Took 0.60 seconds to destroy the instance on the hypervisor. 
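Every "Waiting for the task: (returnval){ ... } to complete" / "progress is 0%" pair in the entries above comes from one polling loop: oslo.vmware registers the vCenter task, re-reads its TaskInfo on an interval, and either returns on success or translates a faulted task into an exception such as the VimFaultException in the traceback above. A minimal Python sketch of that pattern, with a hypothetical get_task_info() callable standing in for the real vim PropertyCollector read:

import time

class TaskFailed(Exception):
    """Raised when the vCenter task ends in the 'error' state."""

def wait_for_task(get_task_info, task_ref, interval=0.5):
    # get_task_info(task_ref) is an assumed helper returning an object with
    # .state ('running' | 'success' | 'error'), .progress and .error,
    # mirroring the vim TaskInfo fields these log entries report.
    while True:
        info = get_task_info(task_ref)
        if info.state == "success":
            return info
        if info.state == "error":
            # oslo.vmware translates the fault at this point; the traceback
            # above shows translate_fault() raising VimFaultException.
            raise TaskFailed(info.error)
        # Corresponds to the "progress is 0%" DEBUG lines while still running.
        print("Task %s: progress is %s%%" % (task_ref, info.progress or 0))
        time.sleep(interval)

The successful DeleteDatastoreFile_Task above follows exactly this path: one in-progress poll, then a completed result carrying duration_secs.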
[ 2545.256053] env[67893]: DEBUG nova.compute.claims [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Aborting claim: {{(pid=67893) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2545.256231] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2545.256443] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2545.279093] env[67893]: DEBUG nova.virt.vmwareapi.images [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: d0f623f5-88e9-4806-8f30-584d277ba5fe] Downloading image file data c08ce966-3dfb-4888-bf12-9a3e73350a20 to the data store datastore1 {{(pid=67893) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2545.339734] env[67893]: DEBUG oslo_vmware.rw_handles [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5e669989-475f-48e6-88b8-f015cb776b00/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67893) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2545.400234] env[67893]: DEBUG oslo_vmware.rw_handles [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Completed reading data from the image iterator. {{(pid=67893) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2545.400455] env[67893]: DEBUG oslo_vmware.rw_handles [None req-22129d49-7559-4c4c-a83d-fb715aef9a75 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5e669989-475f-48e6-88b8-f015cb776b00/c08ce966-3dfb-4888-bf12-9a3e73350a20/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67893) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2545.422097] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ab7eb96-7267-4ada-b846-aff8610aea84 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2545.429184] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-583b0e83-72fe-45cc-b1bc-64b7ea838e72 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2545.458199] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9700d526-d6e5-4a0f-9adf-7b63dd69e715 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2545.464995] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60892f0d-679c-467f-938a-857a7bdc18b1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2545.477552] env[67893]: DEBUG nova.compute.provider_tree [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2545.485949] env[67893]: DEBUG nova.scheduler.client.report [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2545.499571] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.243s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2545.500167] env[67893]: ERROR nova.compute.manager [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2545.500167] env[67893]: Faults: ['InvalidArgument'] [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Traceback (most recent call last): [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: 
a71c2ee1-0286-4098-afca-f7666469a95f] self.driver.spawn(context, instance, image_meta, [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] self._fetch_image_if_missing(context, vi) [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] image_cache(vi, tmp_image_ds_loc) [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] vm_util.copy_virtual_disk( [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] session._wait_for_task(vmdk_copy_task) [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] return self.wait_for_task(task_ref) [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] return evt.wait() [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] result = hub.switch() [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] return self.greenlet.switch() [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] self.f(*self.args, **self.kw) [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] raise exceptions.translate_fault(task_info.error) [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Faults: ['InvalidArgument'] [ 2545.500167] env[67893]: ERROR nova.compute.manager [instance: a71c2ee1-0286-4098-afca-f7666469a95f] [ 2545.501181] env[67893]: DEBUG nova.compute.utils [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] VimFaultException {{(pid=67893) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2545.502687] env[67893]: DEBUG nova.compute.manager [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Build of instance a71c2ee1-0286-4098-afca-f7666469a95f was re-scheduled: A specified parameter was not correct: fileType [ 2545.502687] env[67893]: Faults: ['InvalidArgument'] {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2545.503082] env[67893]: DEBUG nova.compute.manager [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Unplugging VIFs for instance {{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2545.503261] env[67893]: DEBUG nova.compute.manager [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67893) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2545.503434] env[67893]: DEBUG nova.compute.manager [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2545.503596] env[67893]: DEBUG nova.network.neutron [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2546.027037] env[67893]: DEBUG nova.network.neutron [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2546.040042] env[67893]: INFO nova.compute.manager [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Took 0.54 seconds to deallocate network for instance. [ 2546.141828] env[67893]: INFO nova.scheduler.client.report [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Deleted allocations for instance a71c2ee1-0286-4098-afca-f7666469a95f [ 2546.163345] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ca2df50f-05fa-404c-87f3-4785643970ad tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock "a71c2ee1-0286-4098-afca-f7666469a95f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 467.402s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2546.163614] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ff79d8bf-c1b1-4927-a95a-b861689812be tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock "a71c2ee1-0286-4098-afca-f7666469a95f" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 271.534s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2546.163834] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ff79d8bf-c1b1-4927-a95a-b861689812be tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Acquiring lock "a71c2ee1-0286-4098-afca-f7666469a95f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2546.164091] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ff79d8bf-c1b1-4927-a95a-b861689812be tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock "a71c2ee1-0286-4098-afca-f7666469a95f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 
2546.164285] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ff79d8bf-c1b1-4927-a95a-b861689812be tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock "a71c2ee1-0286-4098-afca-f7666469a95f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2546.166793] env[67893]: INFO nova.compute.manager [None req-ff79d8bf-c1b1-4927-a95a-b861689812be tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Terminating instance [ 2546.168773] env[67893]: DEBUG nova.compute.manager [None req-ff79d8bf-c1b1-4927-a95a-b861689812be tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Start destroying the instance on the hypervisor. {{(pid=67893) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2546.168773] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ff79d8bf-c1b1-4927-a95a-b861689812be tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Destroying instance {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2546.169483] env[67893]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6c2dc225-3304-436a-8e64-6289aa65207c {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2546.177860] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11228921-eae4-49cc-af6a-deb11db1ee8d {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2546.202318] env[67893]: WARNING nova.virt.vmwareapi.vmops [None req-ff79d8bf-c1b1-4927-a95a-b861689812be tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a71c2ee1-0286-4098-afca-f7666469a95f could not be found. [ 2546.202588] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-ff79d8bf-c1b1-4927-a95a-b861689812be tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Instance destroyed {{(pid=67893) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2546.202718] env[67893]: INFO nova.compute.manager [None req-ff79d8bf-c1b1-4927-a95a-b861689812be tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Took 0.03 seconds to destroy the instance on the hypervisor. [ 2546.202955] env[67893]: DEBUG oslo.service.loopingcall [None req-ff79d8bf-c1b1-4927-a95a-b861689812be tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2546.203227] env[67893]: DEBUG nova.compute.manager [-] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Deallocating network for instance {{(pid=67893) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2546.203317] env[67893]: DEBUG nova.network.neutron [-] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] deallocate_for_instance() {{(pid=67893) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2546.226240] env[67893]: DEBUG nova.network.neutron [-] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Updating instance_info_cache with network_info: [] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2546.234256] env[67893]: INFO nova.compute.manager [-] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] Took 0.03 seconds to deallocate network for instance. [ 2546.525726] env[67893]: DEBUG oslo_concurrency.lockutils [None req-ff79d8bf-c1b1-4927-a95a-b861689812be tempest-ServersTestJSON-1487459765 tempest-ServersTestJSON-1487459765-project-member] Lock "a71c2ee1-0286-4098-afca-f7666469a95f" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.362s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2546.527012] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "a71c2ee1-0286-4098-afca-f7666469a95f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 131.370s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2546.527347] env[67893]: INFO nova.compute.manager [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] [instance: a71c2ee1-0286-4098-afca-f7666469a95f] During sync_power_state the instance has a pending task (deleting). Skip. 
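The lockutils entries in this stretch record two durations per lock: how long a caller blocked before acquiring it ("waited" -- do_terminate_instance waited 271.534s for the instance lock) and how long the lock was held before release ("held" -- the build path held it for 467.402s). A self-contained sketch of that bookkeeping using plain threading; timed_lock() and its registry are illustrative stand-ins, not oslo.concurrency's API:

import contextlib
import threading
import time

_locks = {}  # name -> threading.Lock, a stand-in for the library's lock registry
_registry_guard = threading.Lock()

@contextlib.contextmanager
def timed_lock(name, owner):
    # Reproduces the two timings logged above: "waited" is the time spent
    # blocking in acquire(), "held" the time between acquire and release.
    with _registry_guard:
        lock = _locks.setdefault(name, threading.Lock())
    start = time.monotonic()
    lock.acquire()
    waited = time.monotonic() - start
    print('Lock "%s" acquired by "%s" :: waited %.3fs' % (name, owner, waited))
    try:
        yield
    finally:
        held = time.monotonic() - start - waited
        lock.release()
        print('Lock "%s" "released" by "%s" :: held %.3fs' % (name, owner, held))

with timed_lock("compute_resources", "ResourceTracker.instance_claim"):
    time.sleep(0.01)  # stand-in for the claim work done under the lock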
[ 2546.527537] env[67893]: DEBUG oslo_concurrency.lockutils [None req-387927e7-27b2-418a-8cd6-c506c4edde6e None None] Lock "a71c2ee1-0286-4098-afca-f7666469a95f" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2560.140687] env[67893]: DEBUG oslo_concurrency.lockutils [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "a5520552-6a53-4351-8687-fc1b18186069" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2560.141026] env[67893]: DEBUG oslo_concurrency.lockutils [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "a5520552-6a53-4351-8687-fc1b18186069" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2560.151845] env[67893]: DEBUG nova.compute.manager [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: a5520552-6a53-4351-8687-fc1b18186069] Starting instance... {{(pid=67893) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 2560.201040] env[67893]: DEBUG oslo_concurrency.lockutils [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2560.201237] env[67893]: DEBUG oslo_concurrency.lockutils [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2560.203029] env[67893]: INFO nova.compute.claims [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: a5520552-6a53-4351-8687-fc1b18186069] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 2560.327350] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78fff7ee-20cf-4924-961e-cc2a7a3232cb {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2560.335299] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4470dd52-5b9a-4d06-9b17-3a3db08458a1 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2560.365212] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with
opID=oslo.vmware-aea345aa-0e24-446b-b3b2-ebf03faddc9f {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2560.372740] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7d05d4a-8e32-4f4f-8496-1414c34e0263 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2560.385646] env[67893]: DEBUG nova.compute.provider_tree [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Inventory has not changed in ProviderTree for provider: 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 {{(pid=67893) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2560.393958] env[67893]: DEBUG nova.scheduler.client.report [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Inventory has not changed for provider 17b8bcc7-ce4b-4d4d-b863-33b2251dfd57 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67893) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2560.407330] env[67893]: DEBUG oslo_concurrency.lockutils [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.206s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2560.407702] env[67893]: DEBUG nova.compute.manager [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: a5520552-6a53-4351-8687-fc1b18186069] Start building networks asynchronously for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2560.437931] env[67893]: DEBUG nova.compute.utils [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Using /dev/sd instead of None {{(pid=67893) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2560.441086] env[67893]: DEBUG nova.compute.manager [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: a5520552-6a53-4351-8687-fc1b18186069] Allocating IP information in the background. 
{{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2560.441086] env[67893]: DEBUG nova.network.neutron [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: a5520552-6a53-4351-8687-fc1b18186069] allocate_for_instance() {{(pid=67893) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2560.447935] env[67893]: DEBUG nova.compute.manager [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: a5520552-6a53-4351-8687-fc1b18186069] Start building block device mappings for instance. {{(pid=67893) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2560.497446] env[67893]: DEBUG nova.policy [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9115f73c22bf4b0e9e5439363832061d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7a19d9bde3814325847c06cec1af09b7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67893) authorize /opt/stack/nova/nova/policy.py:203}} [ 2560.508057] env[67893]: DEBUG nova.compute.manager [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: a5520552-6a53-4351-8687-fc1b18186069] Start spawning the instance on the hypervisor. 
{{(pid=67893) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2560.530936] env[67893]: DEBUG nova.virt.hardware [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T20:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T20:39:37Z,direct_url=,disk_format='vmdk',id=c08ce966-3dfb-4888-bf12-9a3e73350a20,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='304a5519bb7c46efb34a42749d9cf409',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T20:39:38Z,virtual_size=,visibility=), allow threads: False {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2560.531315] env[67893]: DEBUG nova.virt.hardware [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Flavor limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2560.531473] env[67893]: DEBUG nova.virt.hardware [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Image limits 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2560.531623] env[67893]: DEBUG nova.virt.hardware [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Flavor pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2560.531768] env[67893]: DEBUG nova.virt.hardware [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Image pref 0:0:0 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2560.531912] env[67893]: DEBUG nova.virt.hardware [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67893) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2560.532129] env[67893]: DEBUG nova.virt.hardware [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2560.532289] env[67893]: DEBUG nova.virt.hardware [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2560.532453] 
env[67893]: DEBUG nova.virt.hardware [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Got 1 possible topologies {{(pid=67893) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 2560.532613] env[67893]: DEBUG nova.virt.hardware [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 2560.532779] env[67893]: DEBUG nova.virt.hardware [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67893) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 2560.533639] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72288e1d-5b85-42c6-abe3-2df3742a9be8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2560.541541] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56b84500-da13-4547-b46d-3d26f5b49b0e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2560.781052] env[67893]: DEBUG nova.network.neutron [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: a5520552-6a53-4351-8687-fc1b18186069] Successfully created port: 8bc60e34-4303-4949-aa64-cfe300634f32 {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 2561.326585] env[67893]: DEBUG nova.compute.manager [req-0fbf002f-9f26-4555-b3ef-ac323808bdd8 req-8f3f0f87-39b2-475f-88c1-01303aeeab46 service nova] [instance: a5520552-6a53-4351-8687-fc1b18186069] Received event network-vif-plugged-8bc60e34-4303-4949-aa64-cfe300634f32 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 2561.326866] env[67893]: DEBUG oslo_concurrency.lockutils [req-0fbf002f-9f26-4555-b3ef-ac323808bdd8 req-8f3f0f87-39b2-475f-88c1-01303aeeab46 service nova] Acquiring lock "a5520552-6a53-4351-8687-fc1b18186069-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2561.327099] env[67893]: DEBUG oslo_concurrency.lockutils [req-0fbf002f-9f26-4555-b3ef-ac323808bdd8 req-8f3f0f87-39b2-475f-88c1-01303aeeab46 service nova] Lock "a5520552-6a53-4351-8687-fc1b18186069-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2561.327273] env[67893]: DEBUG oslo_concurrency.lockutils [req-0fbf002f-9f26-4555-b3ef-ac323808bdd8 req-8f3f0f87-39b2-475f-88c1-01303aeeab46 service nova] Lock "a5520552-6a53-4351-8687-fc1b18186069-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [
[ 2560.533639] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72288e1d-5b85-42c6-abe3-2df3742a9be8 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2560.541541] env[67893]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56b84500-da13-4547-b46d-3d26f5b49b0e {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2560.781052] env[67893]: DEBUG nova.network.neutron [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: a5520552-6a53-4351-8687-fc1b18186069] Successfully created port: 8bc60e34-4303-4949-aa64-cfe300634f32 {{(pid=67893) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 2561.326585] env[67893]: DEBUG nova.compute.manager [req-0fbf002f-9f26-4555-b3ef-ac323808bdd8 req-8f3f0f87-39b2-475f-88c1-01303aeeab46 service nova] [instance: a5520552-6a53-4351-8687-fc1b18186069] Received event network-vif-plugged-8bc60e34-4303-4949-aa64-cfe300634f32 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 2561.326866] env[67893]: DEBUG oslo_concurrency.lockutils [req-0fbf002f-9f26-4555-b3ef-ac323808bdd8 req-8f3f0f87-39b2-475f-88c1-01303aeeab46 service nova] Acquiring lock "a5520552-6a53-4351-8687-fc1b18186069-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2561.327099] env[67893]: DEBUG oslo_concurrency.lockutils [req-0fbf002f-9f26-4555-b3ef-ac323808bdd8 req-8f3f0f87-39b2-475f-88c1-01303aeeab46 service nova] Lock "a5520552-6a53-4351-8687-fc1b18186069-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2561.327273] env[67893]: DEBUG oslo_concurrency.lockutils [req-0fbf002f-9f26-4555-b3ef-ac323808bdd8 req-8f3f0f87-39b2-475f-88c1-01303aeeab46 service nova] Lock "a5520552-6a53-4351-8687-fc1b18186069-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67893) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2561.327438] env[67893]: DEBUG nova.compute.manager [req-0fbf002f-9f26-4555-b3ef-ac323808bdd8 req-8f3f0f87-39b2-475f-88c1-01303aeeab46 service nova] [instance: a5520552-6a53-4351-8687-fc1b18186069] No waiting events found dispatching network-vif-plugged-8bc60e34-4303-4949-aa64-cfe300634f32 {{(pid=67893) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 2561.327596] env[67893]: WARNING nova.compute.manager [req-0fbf002f-9f26-4555-b3ef-ac323808bdd8 req-8f3f0f87-39b2-475f-88c1-01303aeeab46 service nova] [instance: a5520552-6a53-4351-8687-fc1b18186069] Received unexpected event network-vif-plugged-8bc60e34-4303-4949-aa64-cfe300634f32 for instance with vm_state building and task_state spawning.
[ 2561.435752] env[67893]: DEBUG nova.network.neutron [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: a5520552-6a53-4351-8687-fc1b18186069] Successfully updated port: 8bc60e34-4303-4949-aa64-cfe300634f32 {{(pid=67893) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 2561.447981] env[67893]: DEBUG oslo_concurrency.lockutils [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "refresh_cache-a5520552-6a53-4351-8687-fc1b18186069" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2561.448349] env[67893]: DEBUG oslo_concurrency.lockutils [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquired lock "refresh_cache-a5520552-6a53-4351-8687-fc1b18186069" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2561.448582] env[67893]: DEBUG nova.network.neutron [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: a5520552-6a53-4351-8687-fc1b18186069] Building network info cache for instance {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
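The "No waiting events found" and "Received unexpected event" lines come from the external-event handshake: a thread registers interest in an event before it arrives, and an event with no registered waiter is dropped with a warning, as happened here because the instance was still building when neutron reported network-vif-plugged. A simplified sketch of that pop-event pattern (hypothetical names; Nova's real version is nova.compute.manager.InstanceEvents, whose per-instance "-events" lock is visible above):

    import threading

    class InstanceEvents:
        def __init__(self):
            self._lock = threading.Lock()   # mirrors the "-events" lock
            self._waiters = {}              # (instance_uuid, event) -> Event

        def prepare(self, instance_uuid, event_name):
            # Called by the spawning thread before it expects the event.
            ev = threading.Event()
            with self._lock:
                self._waiters[(instance_uuid, event_name)] = ev
            return ev

        def pop(self, instance_uuid, event_name):
            # Called on event arrival; returns the waiter or None.
            with self._lock:
                return self._waiters.pop((instance_uuid, event_name), None)

    events = InstanceEvents()
    waiter = events.pop("a5520552-6a53-4351-8687-fc1b18186069",
                        "network-vif-plugged-8bc60e34-4303-4949-aa64-cfe300634f32")
    if waiter is None:
        print("No waiting events found; event is logged as unexpected")
    else:
        waiter.set()  # wake the thread waiting for the VIF to be plugged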
[ 2561.507689] env[67893]: DEBUG nova.network.neutron [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: a5520552-6a53-4351-8687-fc1b18186069] Instance cache missing network info. {{(pid=67893) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 2561.668021] env[67893]: DEBUG nova.network.neutron [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: a5520552-6a53-4351-8687-fc1b18186069] Updating instance_info_cache with network_info: [{"id": "8bc60e34-4303-4949-aa64-cfe300634f32", "address": "fa:16:3e:59:9f:bb", "network": {"id": "b5038471-f3b2-4f1f-b2f9-62effa71f1aa", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1405799721-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7a19d9bde3814325847c06cec1af09b7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ef02af-c508-432f-ae29-3a219701d584", "external-id": "nsx-vlan-transportzone-313", "segmentation_id": 313, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8bc60e34-43", "ovs_interfaceid": "8bc60e34-4303-4949-aa64-cfe300634f32", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2561.679289] env[67893]: DEBUG oslo_concurrency.lockutils [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Releasing lock "refresh_cache-a5520552-6a53-4351-8687-fc1b18186069" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2561.679574] env[67893]: DEBUG nova.compute.manager [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: a5520552-6a53-4351-8687-fc1b18186069] Instance network_info: |[{"id": "8bc60e34-4303-4949-aa64-cfe300634f32", "address": "fa:16:3e:59:9f:bb", "network": {"id": "b5038471-f3b2-4f1f-b2f9-62effa71f1aa", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1405799721-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7a19d9bde3814325847c06cec1af09b7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ef02af-c508-432f-ae29-3a219701d584", "external-id": "nsx-vlan-transportzone-313", "segmentation_id": 313, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8bc60e34-43", "ovs_interfaceid": "8bc60e34-4303-4949-aa64-cfe300634f32", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67893) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
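The network_info blob logged above (and again later in the trace) is a JSON-serializable list of VIF entries. A small sketch of pulling the useful fields out of a pared-down copy of this exact entry; the field names match the cache entry as logged:

    import json

    network_info = json.loads("""
    [{"id": "8bc60e34-4303-4949-aa64-cfe300634f32",
      "address": "fa:16:3e:59:9f:bb",
      "network": {"id": "b5038471-f3b2-4f1f-b2f9-62effa71f1aa",
                  "bridge": "br-int",
                  "subnets": [{"cidr": "192.168.128.0/28",
                               "ips": [{"address": "192.168.128.12",
                                        "type": "fixed", "version": 4}]}]},
      "type": "ovs",
      "ovs_interfaceid": "8bc60e34-4303-4949-aa64-cfe300634f32"}]
    """)

    for vif in network_info:
        ips = [ip["address"]
               for subnet in vif["network"]["subnets"]
               for ip in subnet["ips"]]
        print(vif["id"], vif["address"], ips)  # port UUID, MAC, fixed IPs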
[ 2561.679986] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: a5520552-6a53-4351-8687-fc1b18186069] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:59:9f:bb', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '89ef02af-c508-432f-ae29-3a219701d584', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8bc60e34-4303-4949-aa64-cfe300634f32', 'vif_model': 'vmxnet3'}] {{(pid=67893) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 2561.687598] env[67893]: DEBUG oslo.service.loopingcall [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67893) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 2561.688087] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a5520552-6a53-4351-8687-fc1b18186069] Creating VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 2561.688319] env[67893]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a0159361-fd59-4531-bff7-e03c7e6a3b89 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2561.709242] env[67893]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 2561.709242] env[67893]: value = "task-3455533"
[ 2561.709242] env[67893]: _type = "Task"
[ 2561.709242] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2561.717164] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455533, 'name': CreateVM_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2562.219972] env[67893]: DEBUG oslo_vmware.api [-] Task: {'id': task-3455533, 'name': CreateVM_Task, 'duration_secs': 0.315024} completed successfully. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
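The task lines above show the oslo.vmware pattern: invoke a vCenter method that returns a Task object, then poll it until it reports success. A rough sketch of such a poll loop under assumed names; the real loop lives in oslo_vmware/api.py (wait_for_task/_poll_task above) and is driven by an oslo.service looping call rather than time.sleep:

    import time

    def wait_for_task(poll_fn, interval=0.5, timeout=60):
        # poll_fn() is assumed to return a dict like
        # {'state': 'running'|'success'|'error', 'progress': int}
        # for a vCenter task such as CreateVM_Task.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = poll_fn()
            if info["state"] == "success":
                return info                       # "completed successfully"
            if info["state"] == "error":
                raise RuntimeError("vCenter task failed")
            print("progress is %d%%." % info["progress"])
            time.sleep(interval)
        raise TimeoutError("task did not complete in time")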
[ 2562.220162] env[67893]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a5520552-6a53-4351-8687-fc1b18186069] Created VM on the ESX host {{(pid=67893) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 2562.226243] env[67893]: DEBUG oslo_concurrency.lockutils [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2562.226408] env[67893]: DEBUG oslo_concurrency.lockutils [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2562.226726] env[67893]: DEBUG oslo_concurrency.lockutils [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 2562.227022] env[67893]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e65c857d-12eb-46d9-b957-cc40d4778093 {{(pid=67893) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2562.231519] env[67893]: DEBUG oslo_vmware.api [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Waiting for the task: (returnval){
[ 2562.231519] env[67893]: value = "session[522faadd-04b9-9c79-e1e4-fb685291944d]52a41f47-13b5-e4e0-3140-f0b3f1858a6f"
[ 2562.231519] env[67893]: _type = "Task"
[ 2562.231519] env[67893]: } to complete. {{(pid=67893) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2562.241314] env[67893]: DEBUG oslo_vmware.api [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Task: {'id': session[522faadd-04b9-9c79-e1e4-fb685291944d]52a41f47-13b5-e4e0-3140-f0b3f1858a6f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67893) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
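The Acquiring/Acquired/"released" lines that recur throughout this trace, including the image-cache lock just above, all come from named locks. A tiny stand-in that reproduces the logging pattern with plain threading (Nova actually uses oslo_concurrency.lockutils, which additionally supports file-based external locks and semaphores, as the "external semaphore" line shows):

    import threading, time
    from collections import defaultdict
    from contextlib import contextmanager

    _locks = defaultdict(threading.Lock)  # one lock per name, like lockutils

    @contextmanager
    def named_lock(name):
        print('Acquiring lock "%s"' % name)
        t0 = time.monotonic()
        with _locks[name]:
            print('Lock "%s" acquired :: waited %.3fs'
                  % (name, time.monotonic() - t0))
            t1 = time.monotonic()
            try:
                yield
            finally:
                print('Lock "%s" "released" :: held %.3fs'
                      % (name, time.monotonic() - t1))

    with named_lock("refresh_cache-a5520552-6a53-4351-8687-fc1b18186069"):
        pass  # e.g. rebuild the instance's network info cache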
[ 2562.742611] env[67893]: DEBUG oslo_concurrency.lockutils [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2562.742984] env[67893]: DEBUG nova.virt.vmwareapi.vmops [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] [instance: a5520552-6a53-4351-8687-fc1b18186069] Processing image c08ce966-3dfb-4888-bf12-9a3e73350a20 {{(pid=67893) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 2562.743070] env[67893]: DEBUG oslo_concurrency.lockutils [None req-6d7b58fe-9731-4d7c-8ee8-d3ad2af16645 tempest-ServerDiskConfigTestJSON-1171009476 tempest-ServerDiskConfigTestJSON-1171009476-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c08ce966-3dfb-4888-bf12-9a3e73350a20/c08ce966-3dfb-4888-bf12-9a3e73350a20.vmdk" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2563.357596] env[67893]: DEBUG nova.compute.manager [req-b9d642dc-a8c6-491f-a669-2c9bf62b60c0 req-f2318d6a-cf37-418e-a4c1-37f78d15b7c0 service nova] [instance: a5520552-6a53-4351-8687-fc1b18186069] Received event network-changed-8bc60e34-4303-4949-aa64-cfe300634f32 {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 2563.357802] env[67893]: DEBUG nova.compute.manager [req-b9d642dc-a8c6-491f-a669-2c9bf62b60c0 req-f2318d6a-cf37-418e-a4c1-37f78d15b7c0 service nova] [instance: a5520552-6a53-4351-8687-fc1b18186069] Refreshing instance network info cache due to event network-changed-8bc60e34-4303-4949-aa64-cfe300634f32. {{(pid=67893) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 2563.358031] env[67893]: DEBUG oslo_concurrency.lockutils [req-b9d642dc-a8c6-491f-a669-2c9bf62b60c0 req-f2318d6a-cf37-418e-a4c1-37f78d15b7c0 service nova] Acquiring lock "refresh_cache-a5520552-6a53-4351-8687-fc1b18186069" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2563.358181] env[67893]: DEBUG oslo_concurrency.lockutils [req-b9d642dc-a8c6-491f-a669-2c9bf62b60c0 req-f2318d6a-cf37-418e-a4c1-37f78d15b7c0 service nova] Acquired lock "refresh_cache-a5520552-6a53-4351-8687-fc1b18186069" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2563.358546] env[67893]: DEBUG nova.network.neutron [req-b9d642dc-a8c6-491f-a669-2c9bf62b60c0 req-f2318d6a-cf37-418e-a4c1-37f78d15b7c0 service nova] [instance: a5520552-6a53-4351-8687-fc1b18186069] Refreshing network info cache for port 8bc60e34-4303-4949-aa64-cfe300634f32 {{(pid=67893) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 2563.598369] env[67893]: DEBUG nova.network.neutron [req-b9d642dc-a8c6-491f-a669-2c9bf62b60c0 req-f2318d6a-cf37-418e-a4c1-37f78d15b7c0 service nova] [instance: a5520552-6a53-4351-8687-fc1b18186069] Updated VIF entry in instance network info cache for port 8bc60e34-4303-4949-aa64-cfe300634f32. {{(pid=67893) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
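"Processing image" followed by acquiring a per-image lock on the cached .vmdk path is the fetch-if-missing pattern: the cached image is downloaded at most once, and concurrent spawns of the same image wait on its lock and then reuse the copy. An illustrative sketch under assumed helper names (the real logic is nova.virt.vmwareapi.vmops._fetch_image_if_missing, and it works against the datastore rather than the local filesystem):

    import os, threading

    _image_locks = {}
    _guard = threading.Lock()

    def _lock_for(image_id):
        # One lock per image id, created on first use.
        with _guard:
            return _image_locks.setdefault(image_id, threading.Lock())

    def fetch_image_if_missing(cache_dir, image_id, download_fn):
        # Download image_id into cache_dir only if no cached copy exists
        # yet; serialized per image like the "[datastore1]
        # devstack-image-cache_base/..." locks above.
        path = os.path.join(cache_dir, image_id, image_id + ".vmdk")
        with _lock_for(image_id):
            if not os.path.exists(path):
                os.makedirs(os.path.dirname(path), exist_ok=True)
                download_fn(path)
        return path

    # Stand-in download: a real caller would stream from Glance.
    fetch_image_if_missing("/tmp/image-cache",
                           "c08ce966-3dfb-4888-bf12-9a3e73350a20",
                           lambda p: open(p, "wb").close())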
[ 2563.598716] env[67893]: DEBUG nova.network.neutron [req-b9d642dc-a8c6-491f-a669-2c9bf62b60c0 req-f2318d6a-cf37-418e-a4c1-37f78d15b7c0 service nova] [instance: a5520552-6a53-4351-8687-fc1b18186069] Updating instance_info_cache with network_info: [{"id": "8bc60e34-4303-4949-aa64-cfe300634f32", "address": "fa:16:3e:59:9f:bb", "network": {"id": "b5038471-f3b2-4f1f-b2f9-62effa71f1aa", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1405799721-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7a19d9bde3814325847c06cec1af09b7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ef02af-c508-432f-ae29-3a219701d584", "external-id": "nsx-vlan-transportzone-313", "segmentation_id": 313, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8bc60e34-43", "ovs_interfaceid": "8bc60e34-4303-4949-aa64-cfe300634f32", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67893) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2563.607810] env[67893]: DEBUG oslo_concurrency.lockutils [req-b9d642dc-a8c6-491f-a669-2c9bf62b60c0 req-f2318d6a-cf37-418e-a4c1-37f78d15b7c0 service nova] Releasing lock "refresh_cache-a5520552-6a53-4351-8687-fc1b18186069" {{(pid=67893) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}