[ 638.564312] env[68567]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 639.170382] env[68617]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 640.477414] env[68617]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=68617) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 640.477740] env[68617]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=68617) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 640.477839] env[68617]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=68617) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 640.478144] env[68617]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 640.679158] env[68617]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=68617) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}}
[ 640.689199] env[68617]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.010s {{(pid=68617) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}}
[ 640.792989] env[68617]: INFO nova.virt.driver [None req-6df83ca6-b09f-48c8-90a7-834f94634795 None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 640.864484] env[68617]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 640.864674] env[68617]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 640.864778] env[68617]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=68617) __init__ /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:242}}
[ 643.708822] env[68617]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-5f589c0f-5fec-48d1-81dc-7611647223bb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 643.725365] env[68617]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=68617) _create_session /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:242}}
[ 643.725545] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-dbc51a11-51c6-48ee-8d0d-e319d42440df {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 643.758053] env[68617]: INFO oslo_vmware.api [-] Successfully established new session; session ID is 9e2d0.
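
The pair of warnings at the top come from nova's eventlet setup: urllib3 was imported before eventlet.monkey_patch() ran, so its sockets were bound before patching. A minimal sketch of the required ordering, assuming a standalone eventlet program rather than nova's real entry point:

# Sketch only: eventlet must patch the stdlib before network libraries load.
import eventlet
eventlet.monkey_patch()  # replaces socket, threading, time, ... with green versions

import urllib3  # safe now: urllib3 binds to the patched socket module

def fetch(url):
    # Runs cooperatively inside an eventlet green thread.
    return urllib3.PoolManager().request('GET', url).status

print(eventlet.spawn(fetch, 'http://example.com').wait())
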
[ 643.758237] env[68617]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 2.894s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 643.758705] env[68617]: INFO nova.virt.vmwareapi.driver [None req-6df83ca6-b09f-48c8-90a7-834f94634795 None None] VMware vCenter version: 7.0.3
[ 643.762096] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33ffe3f5-2acb-40fd-9c40-c62fd06e7024 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 643.779808] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54e70ce4-c17f-4920-b688-aa43a29e1a6d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 643.785839] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccc6ce94-7fd1-4c1a-b310-4d91d9a720dd {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 643.792446] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92bd501a-e554-40d5-9bec-fbd4753a591c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 643.805514] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89ae2c38-2f36-40b8-9319-968b396b563f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 643.811455] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfbfa9b4-b29f-4e4d-8e76-27716a1f7816 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 643.842868] env[68617]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-05738405-c81c-4c65-9a25-b2967b2a691b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 643.847791] env[68617]: DEBUG nova.virt.vmwareapi.driver [None req-6df83ca6-b09f-48c8-90a7-834f94634795 None None] Extension org.openstack.compute already exists. {{(pid=68617) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:224}}
[ 643.850391] env[68617]: INFO nova.compute.provider_config [None req-6df83ca6-b09f-48c8-90a7-834f94634795 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
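
The oslo_vmware_api_lock lines above (Acquiring, acquired with waited 0.000s, "released" with held 2.894s) are oslo.concurrency's standard tracing around a critical section; here it serializes vCenter session creation. A minimal sketch of the same pattern, where create_session is a hypothetical stand-in for oslo_vmware.api.VMwareAPISession._create_session:

from oslo_concurrency import lockutils

# The decorator produces the Acquiring/acquired/"released" DEBUG lines
# seen above, including the waited/held timings.
@lockutils.synchronized('oslo_vmware_api_lock')
def create_session():  # hypothetical stand-in
    pass  # log in to vCenter once, guarded against concurrent callers

# Equivalent context-manager form:
with lockutils.lock('oslo_vmware_api_lock'):
    pass
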
[ 643.868217] env[68617]: DEBUG nova.context [None req-6df83ca6-b09f-48c8-90a7-834f94634795 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),f61933d9-4fa8-4efd-b566-35357a676a6c(cell1) {{(pid=68617) load_cells /opt/stack/nova/nova/context.py:464}}
[ 643.870130] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 643.870358] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 643.871050] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 643.871460] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] Acquiring lock "f61933d9-4fa8-4efd-b566-35357a676a6c" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 643.871676] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] Lock "f61933d9-4fa8-4efd-b566-35357a676a6c" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 643.872648] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] Lock "f61933d9-4fa8-4efd-b566-35357a676a6c" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 643.897530] env[68617]: INFO dbcounter [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] Registered counter for database nova_cell0
[ 643.906319] env[68617]: INFO dbcounter [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] Registered counter for database nova_cell1
[ 643.909329] env[68617]: DEBUG oslo_db.sqlalchemy.engines [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=68617) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 643.909905] env[68617]: DEBUG oslo_db.sqlalchemy.engines [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=68617) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 643.914206] env[68617]: DEBUG dbcounter [-] [68617] Writer thread running {{(pid=68617) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 643.915078] env[68617]: DEBUG dbcounter [-] [68617] Writer thread running {{(pid=68617) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 643.917233] env[68617]: ERROR nova.db.main.api [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 643.917233] env[68617]: result = function(*args, **kwargs)
[ 643.917233] env[68617]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 643.917233] env[68617]: return func(*args, **kwargs)
[ 643.917233] env[68617]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 643.917233] env[68617]: result = fn(*args, **kwargs)
[ 643.917233] env[68617]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 643.917233] env[68617]: return f(*args, **kwargs)
[ 643.917233] env[68617]: File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 643.917233] env[68617]: return db.service_get_minimum_version(context, binaries)
[ 643.917233] env[68617]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 643.917233] env[68617]: _check_db_access()
[ 643.917233] env[68617]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 643.917233] env[68617]: stacktrace = ''.join(traceback.format_stack())
[ 643.917233] env[68617]:
[ 643.918262] env[68617]: ERROR nova.db.main.api [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 643.918262] env[68617]: result = function(*args, **kwargs)
[ 643.918262] env[68617]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 643.918262] env[68617]: return func(*args, **kwargs)
[ 643.918262] env[68617]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 643.918262] env[68617]: result = fn(*args, **kwargs)
[ 643.918262] env[68617]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 643.918262] env[68617]: return f(*args, **kwargs)
[ 643.918262] env[68617]: File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 643.918262] env[68617]: return db.service_get_minimum_version(context, binaries)
[ 643.918262] env[68617]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 643.918262] env[68617]: _check_db_access()
[ 643.918262] env[68617]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 643.918262] env[68617]: stacktrace = ''.join(traceback.format_stack())
[ 643.918262] env[68617]:
[ 643.918885] env[68617]: WARNING nova.objects.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 643.918885] env[68617]: WARNING nova.objects.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] Failed to get minimum service version for cell f61933d9-4fa8-4efd-b566-35357a676a6c
[ 643.919186] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] Acquiring lock "singleton_lock" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
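
The two ERROR blocks above are deliberate, not crashes: a nova-compute process is not allowed to reach the database directly, so the DB API layer captures the offending stack with traceback.format_stack() and refuses the call, which is why the fallout is only a WARNING about the service-version lookup. A rough sketch of that guard pattern, with all names hypothetical except _check_db_access, which the traceback places in nova/db/main/api.py:

import functools
import traceback

DISABLE_DB_ACCESS = True  # hypothetical flag, set when running as nova-compute

class DBNotAllowed(Exception):
    pass

def _check_db_access():
    if DISABLE_DB_ACCESS:
        # Same move as in the log: record who tried to touch the DB, then refuse.
        stacktrace = ''.join(traceback.format_stack())
        print('No DB access allowed in nova-compute: %s' % stacktrace)
        raise DBNotAllowed()

def guarded(f):  # hypothetical decorator wrapping every DB API function
    @functools.wraps(f)
    def wrapper(*args, **kwargs):
        _check_db_access()
        return f(*args, **kwargs)
    return wrapper
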
[ 643.919349] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] Acquired lock "singleton_lock" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 643.919587] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] Releasing lock "singleton_lock" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 643.919929] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] Full set of CONF: {{(pid=68617) _wait_for_exit_or_signal /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/service.py:362}}
[ 643.920088] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ******************************************************************************** {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2589}}
[ 643.920220] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] Configuration options gathered from: {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2590}}
[ 643.920354] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2591}}
[ 643.920540] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2592}}
[ 643.920669] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ================================================================================ {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2594}}
[ 643.920875] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] allow_resize_to_same_host = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.921057] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] arq_binding_timeout = 300 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.921190] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] backdoor_port = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.921315] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] backdoor_socket = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.921475] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] block_device_allocate_retries = 60 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
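
The command line args / config files entries above show the DevStack layering: three --config-file options, which oslo.config merges with later files overriding earlier ones, so per-host values in nova-cpu-1.conf win over the shared files. A minimal sketch of loading options the same way, assuming a standalone script rather than nova itself:

from oslo_config import cfg

CONF = cfg.ConfigOpts()
CONF.register_opts([
    cfg.BoolOpt('allow_resize_to_same_host', default=False),
    cfg.StrOpt('compute_driver'),
])

# Later --config-file arguments override earlier ones.
CONF(['--config-file', '/etc/nova/nova.conf',
      '--config-file', '/etc/nova/nova-cpu-common.conf',
      '--config-file', '/etc/nova/nova-cpu-1.conf'],
     project='nova')

print(CONF.compute_driver)  # vmwareapi.VMwareVCDriver in this deployment
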
[ 643.921640] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] block_device_allocate_retries_interval = 3 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.921811] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cert = self.pem {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.921975] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.922159] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] compute_monitors = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.922324] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] config_dir = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.922492] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] config_drive_format = iso9660 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.922649] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.922821] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] config_source = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.922986] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] console_host = devstack {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.923179] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] control_exchange = nova {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.923338] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cpu_allocation_ratio = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.923496] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] daemon = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.923664] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] debug = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.923845] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] default_access_ip_network_name = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.924044] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] default_availability_zone = nova {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.924209] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] default_ephemeral_format = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.924368] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] default_green_pool_size = 1000 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.924607] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.924836] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] default_schedule_zone = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.925008] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] disk_allocation_ratio = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.925176] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] enable_new_services = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.925353] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] enabled_apis = ['osapi_compute'] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.925519] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] enabled_ssl_apis = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.925739] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] flat_injected = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.925918] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] force_config_drive = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.926094] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] force_raw_images = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.926271] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] graceful_shutdown_timeout = 5 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.926432] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] heal_instance_info_cache_interval = 60 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.926646] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] host = cpu-1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.926816] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] initial_cpu_allocation_ratio = 4.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.927289] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] initial_disk_allocation_ratio = 1.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.927342] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] initial_ram_allocation_ratio = 1.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.927553] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.927719] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] instance_build_timeout = 0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.927879] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] instance_delete_interval = 300 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.928058] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] instance_format = [instance: %(uuid)s] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.928229] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] instance_name_template = instance-%08x {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.928387] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] instance_usage_audit = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.928556] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] instance_usage_audit_period = month {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.928753] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.928927] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] instances_path = /opt/stack/data/nova/instances {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.929104] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] internal_service_availability_zone = internal {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.929261] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] key = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.929418] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] live_migration_retry_count = 30 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.929578] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] log_config_append = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.929740] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.929901] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] log_dir = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.930067] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] log_file = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.930196] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] log_options = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.930358] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] log_rotate_interval = 1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.930523] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] log_rotate_interval_type = days {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.930686] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] log_rotation_type = none {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.930814] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.930930] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.931108] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.931274] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.931400] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.931559] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] long_rpc_timeout = 1800 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.931752] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] max_concurrent_builds = 10 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.931919] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] max_concurrent_live_migrations = 1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.932088] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] max_concurrent_snapshots = 5 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.932253] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] max_local_block_devices = 3 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.932410] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] max_logfile_count = 30 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.932568] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] max_logfile_size_mb = 200 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.932818] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] maximum_instance_delete_attempts = 5 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.932959] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] metadata_listen = 0.0.0.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.933167] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] metadata_listen_port = 8775 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.933343] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] metadata_workers = 2 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.933504] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] migrate_max_retries = -1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.933672] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] mkisofs_cmd = genisoimage {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.933879] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] my_block_storage_ip = 10.180.1.21 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.934021] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] my_ip = 10.180.1.21 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.934185] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] network_allocate_retries = 0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.934361] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.934527] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] osapi_compute_listen = 0.0.0.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.934753] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] osapi_compute_listen_port = 8774 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.934933] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] osapi_compute_unique_server_name_scope = {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.935117] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] osapi_compute_workers = 2 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.935288] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] password_length = 12 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.935448] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] periodic_enable = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.935630] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] periodic_fuzzy_delay = 60 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.935808] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] pointer_model = usbtablet {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.936015] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] preallocate_images = none {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.936194] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] publish_errors = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.936327] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] pybasedir = /opt/stack/nova {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.936484] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ram_allocation_ratio = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.936644] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] rate_limit_burst = 0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.936813] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] rate_limit_except_level = CRITICAL {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.936972] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] rate_limit_interval = 0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.937144] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] reboot_timeout = 0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.937305] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] reclaim_instance_interval = 0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.937462] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] record = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.937629] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] reimage_timeout_per_gb = 60 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.937794] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] report_interval = 120 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.937955] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] rescue_timeout = 0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.938127] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] reserved_host_cpus = 0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.938288] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] reserved_host_disk_mb = 0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.938445] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] reserved_host_memory_mb = 512 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.938603] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] reserved_huge_pages = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.938759] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] resize_confirm_window = 0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.938971] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] resize_fs_using_block_device = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.939117] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] resume_guests_state_on_host_boot = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.939326] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.939502] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] rpc_response_timeout = 60 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.939691] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] run_external_periodic_tasks = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.939873] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] running_deleted_instance_action = reap {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.940048] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] running_deleted_instance_poll_interval = 1800 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.940210] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] running_deleted_instance_timeout = 0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.940366] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] scheduler_instance_sync_interval = 120 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.940534] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] service_down_time = 720 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.940702] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] servicegroup_driver = db {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.940864] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] shelved_offload_time = 0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.941028] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] shelved_poll_interval = 3600 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.941198] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] shutdown_timeout = 0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.941357] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] source_is_ipv6 = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.941513] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ssl_only = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.941755] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.941928] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] sync_power_state_interval = 600 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.942127] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] sync_power_state_pool_size = 1000 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.942301] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] syslog_log_facility = LOG_USER {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.942458] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] tempdir = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.942617] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] timeout_nbd = 10 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.942777] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] transport_url = **** {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.942935] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] update_resources_interval = 0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.943104] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] use_cow_images = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.943262] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] use_eventlog = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
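
A few lines up, transport_url is printed as **** because oslo.config masks any option registered with secret=True when log_opt_values() dumps the configuration, and log_opt_values() is the call producing this whole block. A small sketch, assuming a standalone ConfigOpts instance:

import logging
from oslo_config import cfg

logging.basicConfig(level=logging.DEBUG)

CONF = cfg.ConfigOpts()
CONF.register_cli_opts([
    cfg.StrOpt('transport_url', secret=True),  # shown as **** in dumps
    cfg.StrOpt('host', default='cpu-1'),
])
CONF(['--transport-url', 'rabbit://user:secret@mq:5672/'], project='demo')

# This is the same call that emitted the "Full set of CONF" block above;
# secret options come out masked.
CONF.log_opt_values(logging.getLogger(__name__), logging.DEBUG)
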
[ 643.943417] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] use_journal = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.943574] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] use_json = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.943753] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] use_rootwrap_daemon = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.943886] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] use_stderr = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.944045] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] use_syslog = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.944203] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vcpu_pin_set = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.944371] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vif_plugging_is_fatal = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.944538] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vif_plugging_timeout = 300 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.944728] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] virt_mkfs = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.944897] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] volume_usage_poll_interval = 0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.945094] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] watch_log_file = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.945273] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] web = /usr/share/spice-html5 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 643.945461] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_concurrency.disable_process_locking = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.945779] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.945958] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.946140] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.946312] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_metrics.metrics_process_name = {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.946480] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.946643] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.946826] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api.auth_strategy = keystone {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.946997] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api.compute_link_prefix = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.947177] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.947350] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api.dhcp_domain = novalocal {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.947517] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api.enable_instance_password = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.947699] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api.glance_link_prefix = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.947880] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.948086] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api.instance_list_cells_batch_strategy = distributed {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.948268] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api.instance_list_per_project_cells = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.948430] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api.list_records_by_skipping_down_cells = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.948593] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api.local_metadata_per_cell = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.948760] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api.max_limit = 1000 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.948929] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api.metadata_cache_expiration = 15 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.949115] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api.neutron_default_tenant_id = default {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.949285] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api.use_forwarded_for = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.949450] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api.use_neutron_default_nets = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.949618] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.949785] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api.vendordata_dynamic_failure_fatal = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.950030] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.950214] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api.vendordata_dynamic_ssl_certfile = {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.950392] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api.vendordata_dynamic_targets = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.950556] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api.vendordata_jsonfile_path = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.950736] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api.vendordata_providers = ['StaticJSON'] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.950930] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.backend = dogpile.cache.memcached {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.951137] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.backend_argument = **** {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.951326] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.config_prefix = cache.oslo {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.951496] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.dead_timeout = 60.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.951661] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.debug_cache_backend = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.951826] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.enable_retry_client = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.951987] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.enable_socket_keepalive = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.952173] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.enabled = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.952339] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.expiration_time = 600 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.952507] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.hashclient_retry_attempts = 2 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.952671] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.hashclient_retry_delay = 1.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.952833] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.memcache_dead_retry = 300 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.953007] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.memcache_password = {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.953179] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.953346] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.953509] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.memcache_pool_maxsize = 10 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.953674] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.memcache_pool_unused_timeout = 60 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.953840] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.memcache_sasl_enabled = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.954055] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.memcache_servers = ['localhost:11211'] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.954246] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.memcache_socket_timeout = 1.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.954417] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.memcache_username = {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.954583] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.proxies = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.954774] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.retry_attempts = 2 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.954948] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.retry_delay = 0.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.955126] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.socket_keepalive_count = 1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.955289] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.socket_keepalive_idle = 1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.955446] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.socket_keepalive_interval = 1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 643.955610] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.tls_allowed_ciphers = None {{(pid=68617) log_opt_values
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.955790] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.tls_cafile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.955952] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.tls_certfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.956130] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.tls_enabled = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.956291] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cache.tls_keyfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.956476] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cinder.auth_section = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.956686] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cinder.auth_type = password {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.956857] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cinder.cafile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.957047] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cinder.catalog_info = volumev3::publicURL {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.957214] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cinder.certfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.957380] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cinder.collect_timing = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.957543] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cinder.cross_az_attach = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.957703] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cinder.debug = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.957863] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cinder.endpoint_template = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.958036] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cinder.http_retries = 3 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.958203] env[68617]: DEBUG oslo_service.service [None 
req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cinder.insecure = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.958364] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cinder.keyfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.958537] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cinder.os_region_name = RegionOne {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.958702] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cinder.split_loggers = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.958889] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cinder.timeout = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.959042] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.959209] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] compute.cpu_dedicated_set = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.959367] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] compute.cpu_shared_set = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.959549] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] compute.image_type_exclude_list = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.959745] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] compute.live_migration_wait_for_vif_plug = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.959967] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] compute.max_concurrent_disk_ops = 0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.960169] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] compute.max_disk_devices_to_attach = -1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.960339] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.960512] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.960676] env[68617]: DEBUG oslo_service.service 
[None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] compute.resource_provider_association_refresh = 300 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.960843] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] compute.shutdown_retry_interval = 10 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.961032] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.961214] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] conductor.workers = 2 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.961386] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] console.allowed_origins = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.961550] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] console.ssl_ciphers = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.961721] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] console.ssl_minimum_version = default {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.961896] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] consoleauth.token_ttl = 600 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.962074] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cyborg.cafile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.962236] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cyborg.certfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.962401] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cyborg.collect_timing = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.962595] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cyborg.connect_retries = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.962781] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cyborg.connect_retry_delay = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.962944] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cyborg.endpoint_override = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.963121] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] 
cyborg.insecure = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.963286] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cyborg.keyfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.963449] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cyborg.max_version = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.963606] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cyborg.min_version = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.963765] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cyborg.region_name = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.963925] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cyborg.service_name = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.964106] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cyborg.service_type = accelerator {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.964274] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cyborg.split_loggers = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.964435] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cyborg.status_code_retries = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.964592] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cyborg.status_code_retry_delay = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.964773] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cyborg.timeout = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.964961] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.965204] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] cyborg.version = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.965452] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] database.backend = sqlalchemy {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.966683] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] database.connection = **** {{(pid=68617) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.966683] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] database.connection_debug = 0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.966683] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] database.connection_parameters = {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.966683] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] database.connection_recycle_time = 3600 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.966683] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] database.connection_trace = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.966826] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] database.db_inc_retry_interval = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.966924] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] database.db_max_retries = 20 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.967150] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] database.db_max_retry_interval = 10 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.967366] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] database.db_retry_interval = 1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.967559] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] database.max_overflow = 50 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.967729] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] database.max_pool_size = 5 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.967903] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] database.max_retries = 10 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.968087] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] database.mysql_sql_mode = TRADITIONAL {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.968255] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] database.mysql_wsrep_sync_wait = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.968418] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] database.pool_timeout = None {{(pid=68617) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.968586] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] database.retry_interval = 10 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.968745] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] database.slave_connection = **** {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.968912] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] database.sqlite_synchronous = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.969086] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] database.use_db_reconnect = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.969284] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api_database.backend = sqlalchemy {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.969444] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api_database.connection = **** {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.969610] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api_database.connection_debug = 0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.969785] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api_database.connection_parameters = {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.969948] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api_database.connection_recycle_time = 3600 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.970125] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api_database.connection_trace = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.970289] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api_database.db_inc_retry_interval = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.970449] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api_database.db_max_retries = 20 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.970608] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api_database.db_max_retry_interval = 10 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.970768] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api_database.db_retry_interval = 1 {{(pid=68617) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.970938] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api_database.max_overflow = 50 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.971111] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api_database.max_pool_size = 5 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.971278] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api_database.max_retries = 10 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.971445] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.971602] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api_database.mysql_wsrep_sync_wait = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.971781] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api_database.pool_timeout = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.971988] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api_database.retry_interval = 10 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.972162] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api_database.slave_connection = **** {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.972329] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] api_database.sqlite_synchronous = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.972502] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] devices.enabled_mdev_types = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.972705] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.972883] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ephemeral_storage_encryption.enabled = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.973068] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ephemeral_storage_encryption.key_size = 512 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.973244] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.api_servers = None {{(pid=68617) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.973411] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.cafile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.973575] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.certfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.973741] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.collect_timing = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.973903] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.connect_retries = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.974077] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.connect_retry_delay = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.974246] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.debug = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.974411] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.default_trusted_certificate_ids = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.974574] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.enable_certificate_validation = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.974770] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.enable_rbd_download = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.974937] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.endpoint_override = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.975119] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.insecure = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.975286] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.keyfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.975446] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.max_version = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.975609] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.min_version = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.975789] env[68617]: DEBUG 
oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.num_retries = 3 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.975995] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.rbd_ceph_conf = {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.976197] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.rbd_connect_timeout = 5 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.976373] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.rbd_pool = {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.976542] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.rbd_user = {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.976702] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.region_name = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.976863] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.service_name = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.977040] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.service_type = image {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.977209] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.split_loggers = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.977368] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.status_code_retries = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.977525] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.status_code_retry_delay = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.977682] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.timeout = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.977863] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.978036] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.verify_glance_signatures = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.978199] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] glance.version = None {{(pid=68617) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.978366] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] guestfs.debug = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.978537] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] hyperv.config_drive_cdrom = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.978702] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] hyperv.config_drive_inject_password = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.978870] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.979045] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] hyperv.enable_instance_metrics_collection = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.979213] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] hyperv.enable_remotefx = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.979397] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] hyperv.instances_path_share = {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.979549] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] hyperv.iscsi_initiator_list = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.979712] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] hyperv.limit_cpu_features = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.979876] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.980050] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.980219] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] hyperv.power_state_check_timeframe = 60 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.980390] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] hyperv.power_state_event_polling_interval = 2 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.980560] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=68617) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.980724] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] hyperv.use_multipath_io = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.980889] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] hyperv.volume_attach_retry_count = 10 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.981058] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] hyperv.volume_attach_retry_interval = 5 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.981221] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] hyperv.vswitch_name = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.981383] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.981549] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] mks.enabled = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.981918] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.982171] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] image_cache.manager_interval = 2400 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.982355] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] image_cache.precache_concurrency = 1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.982530] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] image_cache.remove_unused_base_images = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.982773] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.982984] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.983218] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] image_cache.subdirectory_name = _base {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.983404] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ironic.api_max_retries 
= 60 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.983571] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ironic.api_retry_interval = 2 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.983734] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ironic.auth_section = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.983900] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ironic.auth_type = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.984073] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ironic.cafile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.984237] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ironic.certfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.984400] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ironic.collect_timing = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.984563] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ironic.conductor_group = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.984743] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ironic.connect_retries = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.984909] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ironic.connect_retry_delay = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.985080] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ironic.endpoint_override = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.985247] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ironic.insecure = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.985405] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ironic.keyfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.985564] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ironic.max_version = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.985726] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ironic.min_version = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.985889] env[68617]: DEBUG 
oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ironic.peer_list = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.986057] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ironic.region_name = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.986224] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ironic.serial_console_state_timeout = 10 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.986380] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ironic.service_name = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.986546] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ironic.service_type = baremetal {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.986706] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ironic.split_loggers = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.986863] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ironic.status_code_retries = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.987020] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ironic.status_code_retry_delay = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.987183] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ironic.timeout = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.987362] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.987523] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ironic.version = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.987703] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.987877] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] key_manager.fixed_key = **** {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.988068] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.988234] env[68617]: DEBUG oslo_service.service [None 
req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] barbican.barbican_api_version = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.988391] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] barbican.barbican_endpoint = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.988561] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] barbican.barbican_endpoint_type = public {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.988721] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] barbican.barbican_region_name = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.988881] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] barbican.cafile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.989051] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] barbican.certfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.989218] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] barbican.collect_timing = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.989378] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] barbican.insecure = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.989534] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] barbican.keyfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.989695] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] barbican.number_of_retries = 60 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.989859] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] barbican.retry_delay = 1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.990220] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] barbican.send_service_user_token = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.990220] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] barbican.split_loggers = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.990340] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] barbican.timeout = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.990500] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] barbican.verify_ssl = True {{(pid=68617) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.990656] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] barbican.verify_ssl_path = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.990822] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] barbican_service_user.auth_section = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.990985] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] barbican_service_user.auth_type = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.991159] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] barbican_service_user.cafile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.991318] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] barbican_service_user.certfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.991481] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] barbican_service_user.collect_timing = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.991642] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] barbican_service_user.insecure = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.991800] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] barbican_service_user.keyfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.991963] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] barbican_service_user.split_loggers = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.992133] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] barbican_service_user.timeout = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.992301] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vault.approle_role_id = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.992460] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vault.approle_secret_id = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.992625] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vault.cafile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.992810] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vault.certfile = None {{(pid=68617) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.992974] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vault.collect_timing = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.993150] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vault.insecure = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.993311] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vault.keyfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.993483] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vault.kv_mountpoint = secret {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.993644] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vault.kv_path = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.993814] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vault.kv_version = 2 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.993974] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vault.namespace = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.994143] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vault.root_token_id = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.994315] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vault.split_loggers = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.994466] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vault.ssl_ca_crt_file = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.994629] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vault.timeout = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.994815] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vault.use_ssl = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.994994] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.995179] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] keystone.auth_section = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.995345] env[68617]: DEBUG oslo_service.service [None 
req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] keystone.auth_type = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.995504] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] keystone.cafile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.995663] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] keystone.certfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.995831] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] keystone.collect_timing = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.995989] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] keystone.connect_retries = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.996164] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] keystone.connect_retry_delay = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.996324] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] keystone.endpoint_override = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.996484] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] keystone.insecure = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.996642] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] keystone.keyfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.996801] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] keystone.max_version = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.996969] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] keystone.min_version = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.997156] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] keystone.region_name = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.997317] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] keystone.service_name = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.997484] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] keystone.service_type = identity {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.997645] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] keystone.split_loggers = False {{(pid=68617) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.997806] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] keystone.status_code_retries = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.997964] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] keystone.status_code_retry_delay = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.998134] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] keystone.timeout = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.998314] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.998475] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] keystone.version = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.998674] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.connection_uri = {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.998839] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.cpu_mode = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.999015] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.cpu_model_extra_flags = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.999190] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.cpu_models = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.999362] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.cpu_power_governor_high = performance {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.999530] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.cpu_power_governor_low = powersave {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.999696] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.cpu_power_management = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 643.999872] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.000042] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.device_detach_attempts = 8 {{(pid=68617) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.000225] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.device_detach_timeout = 20 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.000446] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.disk_cachemodes = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.000619] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.disk_prefix = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.000793] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.enabled_perf_events = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.000958] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.file_backed_memory = 0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.001142] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.gid_maps = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.001305] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.hw_disk_discard = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.001463] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.hw_machine_type = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.001633] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.images_rbd_ceph_conf = {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.001797] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.001964] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.002143] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.images_rbd_glance_store_name = {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.002312] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.images_rbd_pool = rbd {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.002480] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.images_type = default {{(pid=68617) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.002650] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.images_volume_group = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.002831] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.inject_key = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.002997] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.inject_partition = -2 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.003173] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.inject_password = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.003336] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.iscsi_iface = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.003496] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.iser_use_multipath = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.003657] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.live_migration_bandwidth = 0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.003821] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.live_migration_completion_timeout = 800 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.003983] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.live_migration_downtime = 500 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.004159] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.live_migration_downtime_delay = 75 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.004324] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.live_migration_downtime_steps = 10 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.004483] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.live_migration_inbound_addr = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.004663] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.live_migration_permit_auto_converge = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.004845] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.live_migration_permit_post_copy = False {{(pid=68617) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.005015] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.live_migration_scheme = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.005192] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.live_migration_timeout_action = abort {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.005355] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.live_migration_tunnelled = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.005513] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.live_migration_uri = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.005676] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.live_migration_with_native_tls = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.005836] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.max_queues = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.005998] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.mem_stats_period_seconds = 10 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.006170] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.nfs_mount_options = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.006479] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.006653] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.num_aoe_discover_tries = 3 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.006820] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.num_iser_scan_tries = 5 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.007023] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.num_memory_encrypted_guests = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.007218] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.num_nvme_discover_tries = 5 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.007386] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.num_pcie_ports = 0 
{{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.007552] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.num_volume_scan_tries = 5 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.007718] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.pmem_namespaces = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.007879] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.quobyte_client_cfg = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.008176] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.008353] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.rbd_connect_timeout = 5 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.008519] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.008682] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.008845] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.rbd_secret_uuid = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.009013] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.rbd_user = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.009182] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.realtime_scheduler_priority = 1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.009352] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.remote_filesystem_transport = ssh {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.009511] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.rescue_image_id = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.009667] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.rescue_kernel_id = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.009825] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.rescue_ramdisk_id = None {{(pid=68617) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.009992] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.rng_dev_path = /dev/urandom {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.010220] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.rx_queue_size = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.010404] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.smbfs_mount_options = {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.010682] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.010860] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.snapshot_compression = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.011033] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.snapshot_image_format = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.011261] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.011429] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.sparse_logical_volumes = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.011595] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.swtpm_enabled = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.011765] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.swtpm_group = tss {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.011937] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.swtpm_user = tss {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.012118] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.sysinfo_serial = unique {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.012281] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.tb_cache_size = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.012438] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.tx_queue_size = None {{(pid=68617) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.012604] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.uid_maps = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.012795] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.use_virtio_for_bridges = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.012997] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.virt_type = kvm {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.013153] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.volume_clear = zero {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.013316] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.volume_clear_size = 0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.013480] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.volume_use_multipath = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.013639] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.vzstorage_cache_path = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.013820] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.013979] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.vzstorage_mount_group = qemu {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.014155] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.vzstorage_mount_opts = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.014324] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.014599] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.014803] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.vzstorage_mount_user = stack {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.014976] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=68617) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.015163] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.auth_section = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.015355] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.auth_type = password {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.015519] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.cafile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.015679] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.certfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.015843] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.collect_timing = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.016008] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.connect_retries = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.016174] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.connect_retry_delay = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.016343] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.default_floating_pool = public {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.016501] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.endpoint_override = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.016663] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.extension_sync_interval = 600 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.016828] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.http_retries = 3 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.016989] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.insecure = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.017160] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.keyfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.017318] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.max_version = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.017486] env[68617]: 
DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.metadata_proxy_shared_secret = **** {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.017642] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.min_version = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.017810] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.ovs_bridge = br-int {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.017973] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.physnets = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.018154] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.region_name = RegionOne {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.018322] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.service_metadata_proxy = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.018480] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.service_name = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.018647] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.service_type = network {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.018810] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.split_loggers = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.018967] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.status_code_retries = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.019137] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.status_code_retry_delay = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.019296] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.timeout = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.019473] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.019634] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] neutron.version = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.019810] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None 
None] notifications.bdms_in_notifications = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.019990] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] notifications.default_level = INFO {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.020176] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] notifications.notification_format = unversioned {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.020338] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] notifications.notify_on_state_change = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.020511] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.020684] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] pci.alias = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.020855] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] pci.device_spec = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.021081] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] pci.report_in_placement = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.021268] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.auth_section = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.021443] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.auth_type = password {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.021612] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.auth_url = http://10.180.1.21/identity {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.021773] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.cafile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.021931] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.certfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.022122] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.collect_timing = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.022264] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] 
placement.connect_retries = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.022421] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.connect_retry_delay = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.022577] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.default_domain_id = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.022735] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.default_domain_name = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.022896] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.domain_id = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.023069] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.domain_name = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.023221] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.endpoint_override = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.023379] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.insecure = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.023531] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.keyfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.023684] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.max_version = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.023839] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.min_version = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.024009] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.password = **** {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.024170] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.project_domain_id = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.024334] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.project_domain_name = Default {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.024497] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.project_id = None {{(pid=68617) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.024693] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.project_name = service {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.024876] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.region_name = RegionOne {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.025174] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.service_name = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.025231] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.service_type = placement {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.025373] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.split_loggers = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.025529] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.status_code_retries = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.025689] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.status_code_retry_delay = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.025848] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.system_scope = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.026010] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.timeout = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.026177] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.trust_id = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.026331] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.user_domain_id = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.026496] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.user_domain_name = Default {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.026651] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.user_id = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.026822] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.username = placement {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
644.027035] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.027213] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] placement.version = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.027392] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] quota.cores = 20 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.027558] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] quota.count_usage_from_placement = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.027728] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.027901] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] quota.injected_file_content_bytes = 10240 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.028079] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] quota.injected_file_path_length = 255 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.028250] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] quota.injected_files = 5 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.028419] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] quota.instances = 10 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.028584] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] quota.key_pairs = 100 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.028750] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] quota.metadata_items = 128 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.028917] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] quota.ram = 51200 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.029091] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] quota.recheck_quota = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.029264] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] quota.server_group_members = 10 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.029430] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None 
None] quota.server_groups = 10 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.029599] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] rdp.enabled = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.029908] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.030113] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.030288] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.030454] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] scheduler.image_metadata_prefilter = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.030619] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.030788] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] scheduler.max_attempts = 3 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.030952] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] scheduler.max_placement_results = 1000 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.031130] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.031292] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] scheduler.query_placement_for_image_type_support = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.031453] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.031625] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] scheduler.workers = 2 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.031798] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
[ 644.031969] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.032162] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.032336] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.032501] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.032687] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.032869] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.033070] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.033246] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] filter_scheduler.host_subset_size = 1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.033412] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.033575] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] filter_scheduler.image_properties_default_architecture = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.033743] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.033911] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] filter_scheduler.isolated_hosts = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.034086] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] filter_scheduler.isolated_images = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.034253] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] filter_scheduler.max_instances_per_host = 50 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.034415] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.034578] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] filter_scheduler.num_instances_weight_multiplier = 0.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.034780] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] filter_scheduler.pci_in_placement = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.034957] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.035139] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.035310] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.035472] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.035647] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.035826] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.035991] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] filter_scheduler.track_instance_changes = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.036183] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
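Every entry in this dump comes from a single oslo.config call: at startup the service walks its registered options and emits one DEBUG line per option, which is why each line ends with log_opt_values (oslo_config/cfg.py:2609). A minimal standalone sketch of the same mechanism, using two example options rather than Nova's full [filter_scheduler] set:

    import logging

    from oslo_config import cfg

    # Register options in a named group, the way Nova registers its
    # [filter_scheduler] options (names/defaults here match the log above).
    opts = [
        cfg.IntOpt('host_subset_size', default=1),
        cfg.FloatOpt('ram_weight_multiplier', default=1.0),
    ]
    CONF = cfg.ConfigOpts()
    CONF.register_opts(opts, group='filter_scheduler')

    logging.basicConfig(level=logging.DEBUG)
    CONF([], project='demo')  # parse an (empty) command line and config files

    # Emits one "filter_scheduler.<option> = <value>" DEBUG line per option,
    # exactly the shape of the dump in this log.
    CONF.log_opt_values(logging.getLogger(__name__), logging.DEBUG)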
[ 644.036357] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] metrics.required = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.036522] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] metrics.weight_multiplier = 1.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.036687] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] metrics.weight_of_unavailable = -10000.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.036854] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] metrics.weight_setting = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.037189] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.037368] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] serial_console.enabled = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.037545] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] serial_console.port_range = 10000:20000 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.037717] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.037893] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.038073] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] serial_console.serialproxy_port = 6083 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.038247] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] service_user.auth_section = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.038420] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] service_user.auth_type = password {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.038583] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] service_user.cafile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.038742] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] service_user.certfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.038906] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] service_user.collect_timing = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.039113] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] service_user.insecure = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.039242] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] service_user.keyfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.039426] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] service_user.send_service_user_token = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.039588] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] service_user.split_loggers = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.039750] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] service_user.timeout = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.039920] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] spice.agent_enabled = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.040144] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] spice.enabled = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.040393] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.040584] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] spice.html5proxy_host = 0.0.0.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.040753] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] spice.html5proxy_port = 6082 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.040917] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] spice.image_compression = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.041087] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] spice.jpeg_compression = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.041250] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] spice.playback_compression = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.041418] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] spice.server_listen = 127.0.0.1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.041585] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.041742] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] spice.streaming_mode = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.041903] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] spice.zlib_compression = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.042077] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] upgrade_levels.baseapi = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.042239] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] upgrade_levels.cert = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.042406] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] upgrade_levels.compute = auto {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.042565] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] upgrade_levels.conductor = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.042780] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] upgrade_levels.scheduler = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.042975] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vendordata_dynamic_auth.auth_section = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.043159] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vendordata_dynamic_auth.auth_type = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.043322] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vendordata_dynamic_auth.cafile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.043479] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vendordata_dynamic_auth.certfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.043643] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vendordata_dynamic_auth.collect_timing = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.043804] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vendordata_dynamic_auth.insecure = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.043962] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vendordata_dynamic_auth.keyfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.044136] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vendordata_dynamic_auth.split_loggers = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.044294] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vendordata_dynamic_auth.timeout = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.044468] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vmware.api_retry_count = 10 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.044651] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vmware.ca_file = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.044837] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vmware.cache_prefix = devstack-image-cache {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.045014] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vmware.cluster_name = testcl1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.045189] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vmware.connection_pool_size = 10 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.045406] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vmware.console_delay_seconds = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.045606] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vmware.datastore_regex = ^datastore.* {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.045821] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.045995] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vmware.host_password = **** {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.046180] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vmware.host_port = 443 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.046350] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vmware.host_username = administrator@vsphere.local {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.046521] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vmware.insecure = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.046683] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vmware.integration_bridge = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.046849] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vmware.maximum_objects = 100 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.047021] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vmware.pbm_default_policy = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.047184] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vmware.pbm_enabled = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.047343] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vmware.pbm_wsdl_location = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.047511] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.047670] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vmware.serial_port_proxy_uri = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.047831] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vmware.serial_port_service_uri = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.047997] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vmware.task_poll_interval = 0.5 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.048184] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vmware.use_linked_clone = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.048354] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vmware.vnc_keymap = en-us {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.048519] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vmware.vnc_port = 5900 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.048684] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vmware.vnc_port_total = 10000 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
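The [vmware] values above are what the VMwareVCDriver hands to oslo.vmware when it opens its vCenter session. A rough sketch of that wiring; the argument order follows oslo.vmware's VMwareAPISession as I understand it and should be treated as an assumption, and the password literal is a placeholder since the log masks it:

    from oslo_vmware import api

    # Values mirror the [vmware] options logged above.
    session = api.VMwareAPISession(
        'vc1.osci.c.eu-de-1.cloud.sap',  # vmware.host_ip
        'administrator@vsphere.local',   # vmware.host_username
        'REPLACE_ME',                    # vmware.host_password (**** in the log)
        10,                              # vmware.api_retry_count
        0.5,                             # vmware.task_poll_interval
        port=443,                        # vmware.host_port
        insecure=True,                   # vmware.insecure: skip cert verification
    )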
[ 644.048871] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vnc.auth_schemes = ['none'] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.049057] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vnc.enabled = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.049349] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.049535] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.049707] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vnc.novncproxy_port = 6080 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.049888] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vnc.server_listen = 127.0.0.1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.050068] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.050254] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vnc.vencrypt_ca_certs = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.050392] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vnc.vencrypt_client_cert = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.050548] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vnc.vencrypt_client_key = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.050722] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.050888] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] workarounds.disable_deep_image_inspection = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.051060] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] workarounds.disable_fallback_pcpu_query = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.051224] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] workarounds.disable_group_policy_check_upcall = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.051385] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.051545] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] workarounds.disable_rootwrap = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.051705] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] workarounds.enable_numa_live_migration = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.051868] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.052045] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.052211] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] workarounds.handle_virt_lifecycle_events = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.052371] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] workarounds.libvirt_disable_apic = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.052531] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] workarounds.never_download_image_if_on_rbd = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.052720] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.052900] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.053074] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.053239] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.053402] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.053561] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.053722] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.053885] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.054059] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.054248] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.054421] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] wsgi.client_socket_timeout = 900 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.054590] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] wsgi.default_pool_size = 1000 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.054781] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] wsgi.keep_alive = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.054957] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] wsgi.max_header_line = 16384 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.055136] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] wsgi.secure_proxy_ssl_header = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.055299] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] wsgi.ssl_ca_file = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.055460] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] wsgi.ssl_cert_file = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.055625] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] wsgi.ssl_key_file = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.055791] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] wsgi.tcp_keepidle = 600 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.055967] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
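wsgi.wsgi_log_format above is a Python %-style template filled from a per-request dict. A quick illustration of the substitution; all request values below are invented purely for the demo:

    # The template string from wsgi.wsgi_log_format above.
    fmt = ('%(client_ip)s "%(request_line)s" status: %(status_code)s '
           'len: %(body_length)s time: %(wall_seconds).7f')

    # Hypothetical request values; in the real server these come from the
    # WSGI layer for each completed request.
    print(fmt % {
        'client_ip': '192.0.2.10',
        'request_line': 'GET /v2.1/servers HTTP/1.1',
        'status_code': 200,
        'body_length': 1874,
        'wall_seconds': 0.0312407,
    })
    # -> 192.0.2.10 "GET /v2.1/servers HTTP/1.1" status: 200 len: 1874 time: 0.0312407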
[ 644.056149] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] zvm.ca_file = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.056310] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] zvm.cloud_connector_url = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.056590] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.056765] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] zvm.reachable_timeout = 300 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.056957] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_policy.enforce_new_defaults = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.057158] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_policy.enforce_scope = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.057337] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_policy.policy_default_rule = default {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.057520] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.057694] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_policy.policy_file = policy.yaml {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.057869] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.058040] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.058205] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.058364] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.058527] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
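The [oslo_policy] block above (policy_file = policy.yaml, policy_dirs = ['policy.d'], with scope and new-defaults enforcement on) is consumed by oslo.policy's Enforcer. A minimal sketch of the typical wiring; the rule name and credentials are illustrative only:

    from oslo_config import cfg
    from oslo_policy import policy

    CONF = cfg.CONF
    CONF([], project='nova')

    # Enforcer picks up oslo_policy.policy_file / policy_dirs from CONF; with
    # enforce_scope and enforce_new_defaults both True, legacy unscoped rules
    # are rejected rather than silently allowed.
    enforcer = policy.Enforcer(CONF)
    allowed = enforcer.enforce(
        'os_compute_api:servers:index',                # illustrative rule name
        {},                                            # target
        {'project_id': 'demo', 'roles': ['reader']},   # illustrative creds
    )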
[ 644.058693] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.058868] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.059054] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] profiler.connection_string = messaging:// {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.059223] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] profiler.enabled = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.059392] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] profiler.es_doc_type = notification {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.059553] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] profiler.es_scroll_size = 10000 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.059721] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] profiler.es_scroll_time = 2m {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.059887] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] profiler.filter_error_trace = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.060066] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] profiler.hmac_keys = **** {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.060238] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] profiler.sentinel_service_name = mymaster {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.060402] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] profiler.socket_timeout = 0.1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.060564] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] profiler.trace_requests = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.060722] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] profiler.trace_sqlalchemy = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.060904] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] profiler_jaeger.process_tags = {} {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.061077] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] profiler_jaeger.service_name_prefix = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.061241] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] profiler_otlp.service_name_prefix = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.061403] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] remote_debug.host = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.061560] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] remote_debug.port = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.061736] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.061901] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.062113] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.062289] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.062455] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.062618] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.062777] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.062942] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.063115] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.063275] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.063457] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.063609] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.063778] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.063945] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.064121] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.064294] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.064458] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.064624] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.064810] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.064980] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.065155] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.065324] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.065482] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.065644] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.065808] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.065972] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.ssl = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.066158] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.066327] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.066489] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.066659] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.066828] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_rabbit.ssl_version = {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.067043] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.067225] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_notifications.retry = -1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.067413] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.067588] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_messaging_notifications.transport_url = **** {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
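The [oslo_messaging_rabbit] options above tune the RabbitMQ driver behind oslo.messaging, and [oslo_messaging_notifications] (driver = ['messagingv2'], topics = ['notifications'], transport_url masked) selects where notifications are sent. A compact sketch of how a service consumes those options; the publisher_id and payload are illustrative, and actually sending requires a reachable broker:

    import oslo_messaging as messaging
    from oslo_config import cfg

    CONF = cfg.CONF
    CONF([], project='nova')

    # Reads oslo_messaging_notifications.transport_url (masked as **** above)
    # plus the [oslo_messaging_rabbit] tuning options from CONF.
    transport = messaging.get_notification_transport(CONF)
    notifier = messaging.Notifier(
        transport,
        publisher_id='compute.devstack1',  # illustrative
        driver='messagingv2',              # matches the logged driver list
        topics=['notifications'],          # matches the logged topics
    )

    # Publishes to the 'notifications' topic via the RabbitMQ transport.
    notifier.info({}, 'compute.instance.create.end', {'note': 'illustrative'})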
[ 644.067760] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_limit.auth_section = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.067925] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_limit.auth_type = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.068096] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_limit.cafile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.068266] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_limit.certfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.068430] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_limit.collect_timing = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.068587] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_limit.connect_retries = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.068745] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_limit.connect_retry_delay = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.068906] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_limit.endpoint_id = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.069067] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_limit.endpoint_override = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.069230] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_limit.insecure = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.069384] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_limit.keyfile = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.069538] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_limit.max_version = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.069692] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_limit.min_version = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.069850] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_limit.region_name = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.070010] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_limit.service_name = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.070168] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_limit.service_type = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.070325] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_limit.split_loggers = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.070480] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_limit.status_code_retries = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.070637] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_limit.status_code_retry_delay = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.070793] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_limit.timeout = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.070948] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_limit.valid_interfaces = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.071142] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_limit.version = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.071326] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_reports.file_event_handler = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.071492] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_reports.file_event_handler_interval = 1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.071651] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] oslo_reports.log_dir = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.071821] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.071982] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vif_plug_linux_bridge_privileged.group = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.072155] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.072324] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.072487] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.072662] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vif_plug_linux_bridge_privileged.user = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.072851] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.073025] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vif_plug_ovs_privileged.group = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.073189] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vif_plug_ovs_privileged.helper_command = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.073354] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.073554] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 644.073670] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] vif_plug_ovs_privileged.user = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
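The vif_plug_*_privileged sections above configure the privsep helpers that os-vif's linux_bridge and ovs plugins escalate through (capability 12 is CAP_NET_ADMIN; 1 is CAP_DAC_OVERRIDE), while the os_vif_* and os_brick sections that follow tune the plugins themselves. Loading all of this hangs off a single entry point; a minimal sketch:

    import os_vif

    # Loads and initializes the installed VIF plugins (linux_bridge, noop,
    # ovs) and with them the [vif_plug_*_privileged] / [os_vif_*] options
    # dumped in this log. Actual plugging is then done per port via
    # os_vif.plug()/unplug() with os_vif object-model arguments.
    os_vif.initialize()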
env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.074939] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.075121] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] os_vif_linux_bridge.vlan_interface = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.075303] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] os_vif_ovs.default_qos_type = linux-noop {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.075475] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] os_vif_ovs.isolate_vif = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.075646] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.075811] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.075979] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.076167] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] os_vif_ovs.ovsdb_interface = native {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.076330] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] os_vif_ovs.per_port_bridge = False {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.076494] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] os_brick.lock_path = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.076657] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] os_brick.wait_mpath_device_attempts = 4 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.076819] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] os_brick.wait_mpath_device_interval = 1 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.077025] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] privsep_osbrick.capabilities = [21] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.077213] 
env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] privsep_osbrick.group = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.077376] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] privsep_osbrick.helper_command = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.077540] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.077703] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] privsep_osbrick.thread_pool_size = 8 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.077864] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] privsep_osbrick.user = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.078045] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.078208] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] nova_sys_admin.group = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.078365] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] nova_sys_admin.helper_command = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.078530] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.078693] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] nova_sys_admin.thread_pool_size = 8 {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.078852] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] nova_sys_admin.user = None {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 644.078981] env[68617]: DEBUG oslo_service.service [None req-3192a2ba-a978-438f-bc86-d4d115ecebcc None None] ******************************************************************************** {{(pid=68617) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2613}} [ 644.079395] env[68617]: INFO nova.service [-] Starting compute node (version 0.0.1) [ 644.089852] env[68617]: WARNING nova.virt.vmwareapi.driver [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] The vmwareapi driver is not tested by the OpenStack project nor does it have clear maintainer(s) and thus its quality can not be ensured. It should be considered experimental and may be removed in a future release. 
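The block above is oslo.config's standard startup dump: every registered option group is written to the DEBUG log by ConfigOpts.log_opt_values() just before the service starts. A minimal sketch of how a service produces such a dump (the option and group names here are illustrative, not Nova's real ones):

    import logging
    from oslo_config import cfg

    CONF = cfg.CONF
    # Register an illustrative option group, analogous to os_vif_ovs above.
    CONF.register_opts(
        [cfg.IntOpt('network_device_mtu', default=1500,
                    help='MTU for created devices')],
        group='example_group')

    logging.basicConfig(level=logging.DEBUG)
    LOG = logging.getLogger(__name__)
    CONF([])  # parse an empty command line
    # Emits one "group.option = value" line per option, as seen above.
    CONF.log_opt_values(LOG, logging.DEBUG)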
[ 644.089852] env[68617]: WARNING nova.virt.vmwareapi.driver [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] The vmwareapi driver is not tested by the OpenStack project nor does it have clear maintainer(s) and thus its quality can not be ensured. It should be considered experimental and may be removed in a future release. If you are using the driver in production please let us know via the openstack-discuss mailing list.
[ 644.090298] env[68617]: INFO nova.virt.node [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] Generated node identity 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f
[ 644.090519] env[68617]: INFO nova.virt.node [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] Wrote node identity 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f to /opt/stack/data/n-cpu-1/compute_id
[ 644.103939] env[68617]: WARNING nova.compute.manager [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] Compute nodes ['5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
[ 644.138726] env[68617]: INFO nova.compute.manager [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
[ 644.160756] env[68617]: WARNING nova.compute.manager [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found.
[ 644.160991] env[68617]: DEBUG oslo_concurrency.lockutils [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 644.161222] env[68617]: DEBUG oslo_concurrency.lockutils [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 644.161371] env[68617]: DEBUG oslo_concurrency.lockutils [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 644.161530] env[68617]: DEBUG nova.compute.resource_tracker [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 644.162672] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c543b44d-9716-422f-afce-6427dc73697a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 644.171423] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ea1eb29-c4cb-4e82-aad3-0a91259fc561 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 644.184977] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81fb380d-12f9-423d-869e-9bdb240fa6ab {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 644.191011] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-952196e1-aade-44b4-a855-8c3de2292c36 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 644.220534] env[68617]: DEBUG nova.compute.resource_tracker [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180939MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 644.220644] env[68617]: DEBUG oslo_concurrency.lockutils [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 644.220820] env[68617]: DEBUG oslo_concurrency.lockutils [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 644.231904] env[68617]: WARNING nova.compute.resource_tracker [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] No compute node record for cpu-1:5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f could not be found.
[ 644.244810] env[68617]: INFO nova.compute.resource_tracker [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f
[ 644.296783] env[68617]: DEBUG nova.compute.resource_tracker [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 644.297035] env[68617]: DEBUG nova.compute.resource_tracker [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 644.405523] env[68617]: INFO nova.scheduler.client.report [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] [req-961ad289-dbf1-452e-bb25-3c73fa1ac7eb] Created resource provider record via placement API for resource provider with UUID 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28.
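The inventory reported to Placement in the entries that follow uses Placement's capacity model: for each resource class, schedulable capacity is (total - reserved) * allocation_ratio, while min_unit/max_unit/step_size bound any single allocation. A small sketch computing that from the inventory values logged below:

    # Resource-class inventory as reported in the log entries that follow.
    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0, 'max_unit': 16},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0, 'max_unit': 65530},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0, 'max_unit': 94},
    }

    for rc, inv in inventory.items():
        # Placement's capacity formula; max_unit caps one allocation, not the total.
        capacity = int((inv['total'] - inv['reserved']) * inv['allocation_ratio'])
        print(f"{rc}: capacity={capacity}, largest single allocation={inv['max_unit']}")
    # -> VCPU: capacity=192; MEMORY_MB: capacity=196078; DISK_GB: capacity=400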
[ 644.422933] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1415f08-a778-4987-b17c-88bf9fc06fae {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 644.430889] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a69d783c-f8d0-431a-be5e-89948039512f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 644.459922] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c8c2ab4-1c41-4336-931d-4efac05ac7ea {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 644.467013] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c9259c7-2665-4efe-983e-64cf0365b6e1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 644.479910] env[68617]: DEBUG nova.compute.provider_tree [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] Updating inventory in ProviderTree for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}}
[ 644.516286] env[68617]: DEBUG nova.scheduler.client.report [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] Updated inventory for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}}
[ 644.516515] env[68617]: DEBUG nova.compute.provider_tree [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] Updating resource provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f generation from 0 to 1 during operation: update_inventory {{(pid=68617) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}}
[ 644.516660] env[68617]: DEBUG nova.compute.provider_tree [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] Updating inventory in ProviderTree for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}}
[ 644.566814] env[68617]: DEBUG nova.compute.provider_tree [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] Updating resource provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f generation from 1 to 2 during operation: update_traits {{(pid=68617) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}}
[ 644.583493] env[68617]: DEBUG nova.compute.resource_tracker [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 644.583681] env[68617]: DEBUG oslo_concurrency.lockutils [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.363s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 644.583843] env[68617]: DEBUG nova.service [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] Creating RPC server for service compute {{(pid=68617) start /opt/stack/nova/nova/service.py:182}}
[ 644.600405] env[68617]: DEBUG nova.service [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] Join ServiceGroup membership for this service compute {{(pid=68617) start /opt/stack/nova/nova/service.py:199}}
[ 644.600607] env[68617]: DEBUG nova.servicegroup.drivers.db [None req-82063752-0f5a-4c0f-9d38-607651de22d0 None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=68617) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}}
[ 653.917741] env[68617]: DEBUG dbcounter [-] [68617] Writing DB stats nova_cell0:SELECT=1 {{(pid=68617) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}}
[ 653.919465] env[68617]: DEBUG dbcounter [-] [68617] Writing DB stats nova_cell1:SELECT=1 {{(pid=68617) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}}
[ 687.748409] env[68617]: DEBUG oslo_concurrency.lockutils [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Acquiring lock "b5707ff5-916e-49ce-9aac-9a08ac51bdf2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 687.748409] env[68617]: DEBUG oslo_concurrency.lockutils [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Lock "b5707ff5-916e-49ce-9aac-9a08ac51bdf2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
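The Acquiring/acquired/"released" triples throughout this log come from oslo.concurrency's lockutils wrapper ("inner"), which logs the wait and hold times around each critical section; for builds, the lock name is the instance UUID, so concurrent operations on one instance serialize while different instances proceed in parallel. A minimal sketch of that pattern, assuming only the public oslo.concurrency API (the lock name is copied from the log; the function body is a placeholder):

    from oslo_concurrency import lockutils

    # Illustrative stand-in for the per-instance serialization logged above.
    @lockutils.synchronized('b5707ff5-916e-49ce-9aac-9a08ac51bdf2')
    def _locked_do_build_and_run_instance():
        # Build steps run while the lock is held; the "waited"/"held"
        # timings in the log are emitted by lockutils' wrapper.
        pass

    _locked_do_build_and_run_instance()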
[ 687.786158] env[68617]: DEBUG nova.compute.manager [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 687.927015] env[68617]: DEBUG oslo_concurrency.lockutils [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 687.927015] env[68617]: DEBUG oslo_concurrency.lockutils [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 687.927015] env[68617]: INFO nova.compute.claims [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 688.070422] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7e21448-160f-4aaf-be45-e5694912537d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 688.087455] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e54f3482-6f72-4d16-8323-a6bb8ef73fd7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 688.124916] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa7d0845-e0ec-4f28-b152-9c6105e99aca {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 688.133112] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c38b3120-e3f5-4617-8501-144a29f83792 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 688.148133] env[68617]: DEBUG nova.compute.provider_tree [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 688.163963] env[68617]: DEBUG nova.scheduler.client.report [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 688.203207] env[68617]: DEBUG oslo_concurrency.lockutils [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.278s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 688.203207] env[68617]: DEBUG nova.compute.manager [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 688.265044] env[68617]: DEBUG nova.compute.utils [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 688.269952] env[68617]: DEBUG nova.compute.manager [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 688.269952] env[68617]: DEBUG nova.network.neutron [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 688.310684] env[68617]: DEBUG nova.compute.manager [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 688.423962] env[68617]: DEBUG nova.compute.manager [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Start spawning the instance on the hypervisor. {{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 688.605380] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._sync_power_states {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 688.630420] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Getting list of instances from cluster (obj){
[ 688.630420] env[68617]: value = "domain-c8"
[ 688.630420] env[68617]: _type = "ClusterComputeResource"
[ 688.630420] env[68617]: } {{(pid=68617) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}}
[ 688.631951] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0021305-26f5-4b65-ae62-355fb1d33814 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 688.646804] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Got total of 0 instances {{(pid=68617) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}}
[ 688.646978] env[68617]: WARNING nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.
[ 688.647764] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Triggering sync for uuid b5707ff5-916e-49ce-9aac-9a08ac51bdf2 {{(pid=68617) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}}
[ 688.649519] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "b5707ff5-916e-49ce-9aac-9a08ac51bdf2" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 688.649741] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 688.650088] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Getting list of instances from cluster (obj){
[ 688.650088] env[68617]: value = "domain-c8"
[ 688.650088] env[68617]: _type = "ClusterComputeResource"
[ 688.650088] env[68617]: } {{(pid=68617) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}}
[ 688.651196] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28e5f876-bf02-43c8-ba09-11dddf5f13f0 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
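The "found 1 instances in the database and 0 instances on the hypervisor" warning above is the periodic power-state sync comparing the cell database's view against what the driver reports; here the freshly claimed instance has not yet been created in vCenter, so the counts diverge transiently and a per-instance sync is triggered. A rough sketch of that reconciliation, with values copied from this log:

    # Illustrative reconciliation in the spirit of _sync_power_states:
    # compare what the DB says should exist with what the driver reports.
    db_instances = {'b5707ff5-916e-49ce-9aac-9a08ac51bdf2'}  # from the cell DB
    driver_instances = set()                                 # vCenter returned 0 VMs

    if len(db_instances) != len(driver_instances):
        print(f"found {len(db_instances)} instances in the database and "
              f"{len(driver_instances)} instances on the hypervisor")
    for uuid in db_instances:
        # Each candidate is then synced under its own per-instance lock.
        print(f"Triggering sync for uuid {uuid}")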
[ 688.661805] env[68617]: DEBUG nova.virt.hardware [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 688.662226] env[68617]: DEBUG nova.virt.hardware [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 688.662489] env[68617]: DEBUG nova.virt.hardware [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 688.663402] env[68617]: DEBUG nova.virt.hardware [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 688.663402] env[68617]: DEBUG nova.virt.hardware [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 688.663402] env[68617]: DEBUG nova.virt.hardware [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 688.663521] env[68617]: DEBUG nova.virt.hardware [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 688.663611] env[68617]: DEBUG nova.virt.hardware [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 688.663951] env[68617]: DEBUG nova.virt.hardware [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 688.664169] env[68617]: DEBUG nova.virt.hardware [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 688.664657] env[68617]: DEBUG nova.virt.hardware [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 688.666591] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b12f09c-f799-451c-bf52-1ed5663a4d3b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 688.673541] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Got total of 0 instances {{(pid=68617) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}}
[ 688.688927] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59f5294b-3317-4157-b953-ef43c46e96cb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 688.708120] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88162cbe-681f-44b6-bf01-28a6f8d5cbe2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 688.751652] env[68617]: DEBUG nova.policy [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bf27da207b70409481d642d2424d0264', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '10e0bbefce664b53ab6dff0effeb96ee', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}}
[ 689.606116] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Acquiring lock "26f6016f-5fb5-4fd2-9ee3-648297d969b3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 689.607197] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Lock "26f6016f-5fb5-4fd2-9ee3-648297d969b3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
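The "Build topologies ... Got 1 possible topologies" lines above show the CPU topology search: with no flavor or image constraints, the limits default to 65536 each and hardware.py enumerates every sockets/cores/threads split whose product equals the vCPU count, so a 1-vCPU m1.nano yields only 1:1:1. A rough sketch of that enumeration (not Nova's exact implementation, which also handles preferred orderings):

    def possible_topologies(vcpus, limits=(65536, 65536, 65536)):
        # Every (sockets, cores, threads) whose product equals the vCPU
        # count, within the per-dimension limits logged above.
        max_s, max_c, max_t = limits
        topos = []
        for s in range(1, min(vcpus, max_s) + 1):
            if vcpus % s:
                continue
            for c in range(1, min(vcpus // s, max_c) + 1):
                if (vcpus // s) % c:
                    continue
                t = vcpus // (s * c)
                if t <= max_t:
                    topos.append((s, c, t))
        return topos

    print(possible_topologies(1))  # [(1, 1, 1)] -> the single topology logged above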
Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 689.747212] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 689.747453] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 689.754148] env[68617]: INFO nova.compute.claims [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 689.914964] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4f62f6d-7fa7-49a1-82f7-48174651fa16 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 689.924090] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41db8bf2-6f61-452d-a9e7-ddf141a02be5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 689.958906] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51a8e535-cb44-4c6b-882b-c55c89e9b546 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 689.967391] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b3ca62c-fa3a-4eae-a51a-92c02dfffaa2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 689.989113] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Acquiring lock "c507115c-92a0-4513-aae8-7dc8f95bc0ea" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 689.989113] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Lock "c507115c-92a0-4513-aae8-7dc8f95bc0ea" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 689.989934] env[68617]: DEBUG nova.compute.provider_tree [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Inventory has not changed in ProviderTree for 
provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 690.014238] env[68617]: DEBUG nova.scheduler.client.report [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 690.022779] env[68617]: DEBUG nova.compute.manager [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 690.038574] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.288s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 690.038775] env[68617]: DEBUG nova.compute.manager [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 690.090098] env[68617]: DEBUG nova.compute.utils [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 690.094012] env[68617]: DEBUG nova.compute.manager [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Not allocating networking since 'none' was specified. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 690.109943] env[68617]: DEBUG nova.compute.manager [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Start building block device mappings for instance. 
{{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 690.117385] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 690.117385] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 690.118608] env[68617]: INFO nova.compute.claims [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 690.240658] env[68617]: DEBUG nova.compute.manager [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Start spawning the instance on the hypervisor. {{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 690.277601] env[68617]: DEBUG nova.virt.hardware [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 690.277954] env[68617]: DEBUG nova.virt.hardware [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 690.278107] env[68617]: DEBUG nova.virt.hardware [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 690.278595] env[68617]: DEBUG nova.virt.hardware [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Flavor pref 0:0:0 {{(pid=68617) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 690.278595] env[68617]: DEBUG nova.virt.hardware [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 690.278759] env[68617]: DEBUG nova.virt.hardware [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 690.278917] env[68617]: DEBUG nova.virt.hardware [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 690.279592] env[68617]: DEBUG nova.virt.hardware [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 690.279592] env[68617]: DEBUG nova.virt.hardware [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 690.279592] env[68617]: DEBUG nova.virt.hardware [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 690.279758] env[68617]: DEBUG nova.virt.hardware [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 690.281274] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e8a0a1d-fc16-4839-ad79-8ae57a1c0d94 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 690.291254] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c680854e-f169-4a11-8d33-a063a2257012 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 690.299890] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f95af806-76ef-4ab3-9272-53cc4785776d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 690.316709] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Instance VIF info [] 
{{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 690.325960] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 690.326611] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5f586bcd-33f8-43c2-88fd-2589fbf2f206 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 690.328963] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c869e6bc-6cfe-4435-80dd-11c20ac0f738 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 690.366368] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-badf1ab9-3d92-4d19-9a64-1b135f5dafc5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 690.369440] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Created folder: OpenStack in parent group-v4. [ 690.369512] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Creating folder: Project (91a9cad95f8649718f4d61b32b2c36f6). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 690.369739] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f7eff0d7-aa12-4e9d-8cf5-d8c1dc7dc0f3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 690.380734] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-225f9386-328d-4a66-ba64-9771d8b86dcb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 690.384688] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Created folder: Project (91a9cad95f8649718f4d61b32b2c36f6) in parent group-v693691. [ 690.385065] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Creating folder: Instances. Parent ref: group-v693692. 
{{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 690.385996] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b6bb6526-34b9-4b80-8ac8-02f596ccefad {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 690.397386] env[68617]: DEBUG nova.compute.provider_tree [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 690.402038] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Created folder: Instances in parent group-v693692. [ 690.402038] env[68617]: DEBUG oslo.service.loopingcall [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 690.402038] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 690.402038] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-715103e9-07a4-4a31-958c-de831b59ef07 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 690.414546] env[68617]: DEBUG nova.scheduler.client.report [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 690.425749] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 690.425749] env[68617]: value = "task-3470686" [ 690.425749] env[68617]: _type = "Task" [ 690.425749] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 690.436693] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470686, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 690.446506] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.327s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 690.446506] env[68617]: DEBUG nova.compute.manager [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 690.512468] env[68617]: DEBUG nova.compute.utils [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 690.513859] env[68617]: DEBUG nova.compute.manager [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 690.514649] env[68617]: DEBUG nova.network.neutron [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 690.536702] env[68617]: DEBUG nova.compute.manager [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 690.631272] env[68617]: DEBUG nova.compute.manager [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 690.665717] env[68617]: DEBUG nova.virt.hardware [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 690.665983] env[68617]: DEBUG nova.virt.hardware [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 690.666324] env[68617]: DEBUG nova.virt.hardware [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 690.666413] env[68617]: DEBUG nova.virt.hardware [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 690.666565] env[68617]: DEBUG nova.virt.hardware [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 690.666736] env[68617]: DEBUG nova.virt.hardware [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 690.666992] env[68617]: DEBUG nova.virt.hardware [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 690.667273] env[68617]: DEBUG nova.virt.hardware [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 690.667417] env[68617]: DEBUG nova.virt.hardware [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 690.667703] env[68617]: DEBUG nova.virt.hardware [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 690.667841] env[68617]: DEBUG nova.virt.hardware [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 690.669450] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16e81581-a4c6-4d75-8d4b-ee3aef8ba08d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 690.683992] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49da5266-e4dc-495e-a6cc-664f00bb5f21 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 690.939183] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470686, 'name': CreateVM_Task, 'duration_secs': 0.315012} completed successfully. 
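The nova.virt.hardware block above enumerates guest CPU topologies: flavor and image limits of 0 mean "unset", the effective limits default to 65536 sockets/cores/threads, and a 1-vCPU flavor admits exactly one factorisation, 1:1:1. A simplified reconstruction of that enumeration (condensed from the behaviour visible in the log, not copied from hardware.py):

```python
import collections

VirtCPUTopology = collections.namedtuple(
    "VirtCPUTopology", ["sockets", "cores", "threads"])

def possible_cpu_topologies(vcpus, max_sockets=65536,
                            max_cores=65536, max_threads=65536):
    """Yield every sockets*cores*threads factorisation of vcpus
    that fits under the given limits."""
    for sockets in range(1, min(vcpus, max_sockets) + 1):
        for cores in range(1, min(vcpus, max_cores) + 1):
            for threads in range(1, min(vcpus, max_threads) + 1):
                if sockets * cores * threads == vcpus:
                    yield VirtCPUTopology(sockets, cores, threads)

# For the m1.nano flavor (vcpus=1) this yields exactly one topology,
# matching "Got 1 possible topologies" in the log:
assert list(possible_cpu_topologies(1)) == [VirtCPUTopology(1, 1, 1)]
```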
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 690.942016] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 690.942016] env[68617]: DEBUG oslo_vmware.service [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c771ae8-3bdd-446d-93c4-c37b15bc28bf {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 690.952034] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 690.952034] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 690.952034] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 690.952034] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3cc16cc9-ca66-40f9-9475-364d63e6fd05 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 690.960254] env[68617]: DEBUG oslo_vmware.api [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Waiting for the task: (returnval){ [ 690.960254] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52dc0422-f119-4571-4136-02c354f42ed5" [ 690.960254] env[68617]: _type = "Task" [ 690.960254] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 690.968270] env[68617]: DEBUG oslo_vmware.api [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52dc0422-f119-4571-4136-02c354f42ed5, 'name': SearchDatastore_Task} progress is 0%. 
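The lock and external-semaphore lines around "[datastore2] devstack-image-cache_base/<image-id>" serialise concurrent requests so that only one build downloads a given Glance image into the datastore cache. A stdlib analogue of that named-lock pattern (Nova uses oslo.concurrency's lockutils; this registry is purely illustrative):

```python
import threading
from contextlib import contextmanager

_registry_guard = threading.Lock()
_resource_locks = {}

@contextmanager
def resource_lock(name):
    """Acquire a named lock, creating it on first use, so that only
    one worker fetches a given image into the cache at a time."""
    with _registry_guard:
        lock = _resource_locks.setdefault(name, threading.Lock())
    print(f'Acquiring lock "{name}"')
    with lock:
        print(f'Acquired lock "{name}"')
        yield
    print(f'Releasing lock "{name}"')

# Usage, mirroring the log:
# with resource_lock("[datastore2] devstack-image-cache_base/c87eab51-..."):
#     fetch_image_if_missing()
```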
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 691.038136] env[68617]: DEBUG nova.policy [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c62c85f2265c422193e6e03eff97e160', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b2c347ad328349a29dd82c87395ba43e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 691.256281] env[68617]: DEBUG nova.network.neutron [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Successfully created port: 52954f88-8663-4161-97b3-78be5e072d67 {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 691.474368] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 691.474663] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 691.474859] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 691.475044] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 691.475474] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 691.475870] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-607ff165-1290-4872-b7e2-25b44f9cb933 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 691.528859] env[68617]: DEBUG 
nova.virt.vmwareapi.ds_util [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 691.528959] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 691.529761] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a37fb276-ce26-43df-90a5-8b8a8c2d7c15 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 691.544747] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-edc2681c-979a-410b-8148-b54d6193dbe8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 691.552191] env[68617]: DEBUG oslo_vmware.api [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Waiting for the task: (returnval){ [ 691.552191] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]5247c3a0-de16-3a48-110a-cf919631b47e" [ 691.552191] env[68617]: _type = "Task" [ 691.552191] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 691.565262] env[68617]: DEBUG oslo_vmware.api [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]5247c3a0-de16-3a48-110a-cf919631b47e, 'name': SearchDatastore_Task} progress is 0%. 
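_fetch_image_if_missing first probes the datastore image cache (the HostDatastoreBrowser.SearchDatastore_Task calls), creating the cache directory if needed, and only falls through to a download on a miss. The control flow, reduced to a sketch where all five parameters are hypothetical callables standing in for the vCenter round-trips seen above:

```python
def fetch_image_if_missing(image_id, cache_dir_exists, search_datastore,
                           make_directory, download_image):
    """Ensure <image_id>.vmdk is present in the datastore image cache.

    All parameters besides image_id are hypothetical callables that
    stand in for the vCenter API calls visible in the log.
    """
    if not cache_dir_exists():
        make_directory()               # FileManager.MakeDirectory
    if search_datastore(image_id):     # SearchDatastore_Task
        return "cache hit"
    download_image(image_id)           # HTTP write to vmware_temp/...
    return "downloaded"
```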
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 691.891848] env[68617]: DEBUG nova.network.neutron [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Successfully created port: 97d01f6d-8cc5-4382-9b4d-c38bef6bd469 {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 692.072431] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 692.072431] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Creating directory with path [datastore2] vmware_temp/135b78cd-b10a-4932-b77a-56939c307f08/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 692.072431] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2e51a4e9-a889-4c1a-9b67-4f16e1de479d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 692.094021] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Created directory with path [datastore2] vmware_temp/135b78cd-b10a-4932-b77a-56939c307f08/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 692.094920] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Fetch image to [datastore2] vmware_temp/135b78cd-b10a-4932-b77a-56939c307f08/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 692.095405] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/135b78cd-b10a-4932-b77a-56939c307f08/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 692.098253] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3a706ca-9ae6-4e11-af75-32007779115e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 692.106709] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0052333a-93cf-436d-beef-b390e6b59594 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 692.120734] env[68617]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23c029d1-57fe-4a39-83d1-246f8571e711 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 692.159288] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa3f8124-33c4-4f45-9b1e-5e37f996ab0f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 692.167558] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-66568b11-1dde-4f54-8db3-cbd7ae97829e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 692.190025] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 692.281939] env[68617]: DEBUG oslo_vmware.rw_handles [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/135b78cd-b10a-4932-b77a-56939c307f08/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 692.355672] env[68617]: DEBUG oslo_vmware.rw_handles [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 692.357925] env[68617]: DEBUG oslo_vmware.rw_handles [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/135b78cd-b10a-4932-b77a-56939c307f08/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
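The rw_handles lines stream the image bytes straight from the Glance iterator into an HTTPS PUT against the ESX host's /folder endpoint, then close the write handle. A bare http.client sketch of such a fixed-length streaming write (oslo.vmware's real handle adds service-ticket auth, cookie handling, and richer error checks):

```python
import http.client
import ssl

def write_file(host, path, data_iter, size):
    """Stream `size` bytes from data_iter to https://host/<path>."""
    ctx = ssl.create_default_context()
    conn = http.client.HTTPSConnection(host, 443, context=ctx)
    conn.putrequest("PUT", path)
    conn.putheader("Content-Length", str(size))
    conn.endheaders()
    written = 0
    for chunk in data_iter:            # e.g. the Glance image iterator
        conn.send(chunk)
        written += len(chunk)
    assert written == size, "short read from image iterator"
    resp = conn.getresponse()
    resp.read()
    conn.close()                       # "Closing write handle" in the log
    return resp.status
```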
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 692.645585] env[68617]: DEBUG oslo_concurrency.lockutils [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Acquiring lock "5f4991a3-c34b-45b1-a3af-94d7d990eef1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 692.645840] env[68617]: DEBUG oslo_concurrency.lockutils [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Lock "5f4991a3-c34b-45b1-a3af-94d7d990eef1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 692.664759] env[68617]: DEBUG nova.compute.manager [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 692.735875] env[68617]: DEBUG oslo_concurrency.lockutils [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 692.735875] env[68617]: DEBUG oslo_concurrency.lockutils [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 692.736957] env[68617]: INFO nova.compute.claims [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 692.890659] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ce4525e-f232-4bd8-a4a6-21e2b14d5a4b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 692.898287] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e84bf18c-17f5-4590-986f-49ff9135882d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 692.932625] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-564fe16d-7d79-41b7-8f35-bbe70129fabe {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 692.941506] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-96dbec0d-e326-4c75-87db-5ba20b2c4430 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 692.957330] env[68617]: DEBUG nova.compute.provider_tree [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 692.967912] env[68617]: DEBUG nova.scheduler.client.report [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 692.990402] env[68617]: DEBUG oslo_concurrency.lockutils [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.255s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 692.990903] env[68617]: DEBUG nova.compute.manager [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 693.030666] env[68617]: DEBUG nova.compute.utils [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 693.031879] env[68617]: DEBUG nova.compute.manager [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 693.033659] env[68617]: DEBUG nova.network.neutron [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 693.046365] env[68617]: DEBUG nova.compute.manager [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Start building block device mappings for instance. 
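Each "Claim successful" line is the resource tracker testing the requested flavor against free capacity while holding the "compute_resources" lock whose acquire/release bracket instance_claim above. A toy version of that guarded claim (the real tracker also applies allocation ratios and reserved amounts):

```python
import threading

class ResourceTracker:
    """Toy version of the claim logic guarded by 'compute_resources'."""

    def __init__(self, vcpus, memory_mb, disk_gb):
        self._lock = threading.Lock()
        self.free = {"VCPU": vcpus, "MEMORY_MB": memory_mb, "DISK_GB": disk_gb}

    def instance_claim(self, flavor):
        wanted = {"VCPU": flavor["vcpus"],
                  "MEMORY_MB": flavor["memory_mb"],
                  "DISK_GB": flavor["root_gb"]}
        with self._lock:  # Lock "compute_resources" in the log
            if any(self.free[k] < v for k, v in wanted.items()):
                raise RuntimeError("insufficient capacity")
            for k, v in wanted.items():
                self.free[k] -= v
        return wanted

# m1.nano from the log: 1 vCPU, 128 MB RAM, 1 GB root disk,
# claimed against the totals reported for this provider.
rt = ResourceTracker(vcpus=48, memory_mb=196590, disk_gb=400)
rt.instance_claim({"vcpus": 1, "memory_mb": 128, "root_gb": 1})
```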
{{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 693.128631] env[68617]: DEBUG nova.compute.manager [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Start spawning the instance on the hypervisor. {{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 693.171322] env[68617]: DEBUG nova.virt.hardware [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 693.171322] env[68617]: DEBUG nova.virt.hardware [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 693.171322] env[68617]: DEBUG nova.virt.hardware [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 693.171628] env[68617]: DEBUG nova.virt.hardware [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 693.171628] env[68617]: DEBUG nova.virt.hardware [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 693.171628] env[68617]: DEBUG nova.virt.hardware [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 693.171628] env[68617]: DEBUG nova.virt.hardware [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:569}} [ 693.171628] env[68617]: DEBUG nova.virt.hardware [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 693.171873] env[68617]: DEBUG nova.virt.hardware [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 693.172191] env[68617]: DEBUG nova.virt.hardware [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 693.172862] env[68617]: DEBUG nova.virt.hardware [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 693.175468] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ce2838b-8047-42a9-b422-cfdd5f3330df {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 693.188505] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5c91856-a906-4a9c-80a0-149852bb9cc1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 693.273792] env[68617]: DEBUG nova.policy [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '72eb640c49fc45e49f1b4b9c797943cf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '20cf3ab18e0b4e8d89ae53ed3b01abfc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 694.195860] env[68617]: DEBUG nova.network.neutron [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Successfully updated port: 52954f88-8663-4161-97b3-78be5e072d67 {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 694.219847] env[68617]: DEBUG oslo_concurrency.lockutils [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Acquiring lock "refresh_cache-b5707ff5-916e-49ce-9aac-9a08ac51bdf2" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 694.220020] env[68617]: DEBUG oslo_concurrency.lockutils [None req-66175842-a1fc-456f-864c-ceb774abf015 
tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Acquired lock "refresh_cache-b5707ff5-916e-49ce-9aac-9a08ac51bdf2" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 694.220211] env[68617]: DEBUG nova.network.neutron [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 694.424157] env[68617]: DEBUG nova.network.neutron [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 694.650421] env[68617]: DEBUG nova.network.neutron [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Successfully created port: f91cc40a-05e2-40b3-9da5-5186487f847d {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 694.734866] env[68617]: DEBUG nova.network.neutron [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Successfully updated port: 97d01f6d-8cc5-4382-9b4d-c38bef6bd469 {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 694.761392] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Acquiring lock "refresh_cache-c507115c-92a0-4513-aae8-7dc8f95bc0ea" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 694.762250] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Acquired lock "refresh_cache-c507115c-92a0-4513-aae8-7dc8f95bc0ea" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 694.762250] env[68617]: DEBUG nova.network.neutron [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 695.007151] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Acquiring lock "b95883b2-0366-4f52-bdf2-aa6259fafc58" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 695.007728] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 
tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Lock "b95883b2-0366-4f52-bdf2-aa6259fafc58" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 695.030576] env[68617]: DEBUG nova.compute.manager [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 695.054854] env[68617]: DEBUG nova.network.neutron [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 695.120351] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 695.120351] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 695.126222] env[68617]: INFO nova.compute.claims [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 695.362363] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9329ab4b-2b74-47e6-bac6-c4475c3a189f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 695.374599] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0da691a7-2b05-4d82-9d13-94293f6a75a8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 695.412095] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4aff685-4421-4e89-867f-157977126c50 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 695.420881] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61d70bec-0dad-4eba-8bda-56e2ec86a724 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 695.442514] env[68617]: DEBUG nova.compute.provider_tree [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 
tempest-ServerPasswordTestJSON-1098303230-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 695.469586] env[68617]: DEBUG nova.scheduler.client.report [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 695.496296] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.376s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 695.496802] env[68617]: DEBUG nova.compute.manager [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 695.549021] env[68617]: DEBUG nova.compute.utils [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 695.549021] env[68617]: DEBUG nova.compute.manager [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 695.549021] env[68617]: DEBUG nova.network.neutron [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 695.570056] env[68617]: DEBUG nova.compute.manager [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 695.679290] env[68617]: DEBUG nova.compute.manager [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Start spawning the instance on the hypervisor. 
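"Inventory has not changed for provider ... based on inventory data" is the report client diffing the freshly computed inventory against its cached ProviderTree copy and skipping the placement API call when they match. At its core this is a dict comparison over the resource classes printed in the log (the real client normalises records before comparing, so treat this as a sketch):

```python
def inventory_changed(cached, current):
    """Return True when a PUT to the placement API is needed.

    `cached` and `current` map resource classes (VCPU, MEMORY_MB,
    DISK_GB) to their inventory records, as printed in the log.
    """
    return cached != current

cached = {"VCPU": {"total": 48, "reserved": 0, "min_unit": 1,
                   "max_unit": 16, "step_size": 1, "allocation_ratio": 4.0}}
current = dict(cached)
assert not inventory_changed(cached, current)   # "has not changed"
```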
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 695.723573] env[68617]: DEBUG nova.virt.hardware [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 695.723573] env[68617]: DEBUG nova.virt.hardware [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 695.723754] env[68617]: DEBUG nova.virt.hardware [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 695.724408] env[68617]: DEBUG nova.virt.hardware [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 695.724589] env[68617]: DEBUG nova.virt.hardware [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 695.724735] env[68617]: DEBUG nova.virt.hardware [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 695.724968] env[68617]: DEBUG nova.virt.hardware [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 695.725155] env[68617]: DEBUG nova.virt.hardware [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 695.725392] env[68617]: DEBUG nova.virt.hardware [None 
req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 695.725495] env[68617]: DEBUG nova.virt.hardware [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 695.725670] env[68617]: DEBUG nova.virt.hardware [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 695.729296] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ddd0286-369c-41e4-8e92-819e6fc24976 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 695.747029] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd4d1cca-164e-41dc-b399-a40953cf2a23 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 696.150471] env[68617]: DEBUG nova.network.neutron [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Updating instance_info_cache with network_info: [{"id": "52954f88-8663-4161-97b3-78be5e072d67", "address": "fa:16:3e:6a:59:10", "network": {"id": "064d41fb-a622-47db-b457-8854c04c637e", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1693772355-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "10e0bbefce664b53ab6dff0effeb96ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f39e3b37-7906-4bbc-820e-ceac74e4d827", "external-id": "nsx-vlan-transportzone-328", "segmentation_id": 328, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap52954f88-86", "ovs_interfaceid": "52954f88-8663-4161-97b3-78be5e072d67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 696.168075] env[68617]: DEBUG oslo_concurrency.lockutils [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Releasing lock "refresh_cache-b5707ff5-916e-49ce-9aac-9a08ac51bdf2" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 696.168690] env[68617]: DEBUG nova.compute.manager [None req-66175842-a1fc-456f-864c-ceb774abf015 
tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Instance network_info: |[{"id": "52954f88-8663-4161-97b3-78be5e072d67", "address": "fa:16:3e:6a:59:10", "network": {"id": "064d41fb-a622-47db-b457-8854c04c637e", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1693772355-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "10e0bbefce664b53ab6dff0effeb96ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f39e3b37-7906-4bbc-820e-ceac74e4d827", "external-id": "nsx-vlan-transportzone-328", "segmentation_id": 328, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap52954f88-86", "ovs_interfaceid": "52954f88-8663-4161-97b3-78be5e072d67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 696.168927] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6a:59:10', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f39e3b37-7906-4bbc-820e-ceac74e4d827', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '52954f88-8663-4161-97b3-78be5e072d67', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 696.181606] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Creating folder: Project (10e0bbefce664b53ab6dff0effeb96ee). Parent ref: group-v693691. 
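The "Instance VIF info" line above is the result of condensing each Neutron port in network_info into a VMware-friendly description: the NSX logical-switch id becomes an OpaqueNetwork reference and the NIC model (vmxnet3 here) comes from the image's hints. A sketch of that translation, keyed to the exact fields visible in the log:

```python
def vif_info_from_network_info(network_info, vif_model="vmxnet3"):
    """Map Nova network_info entries to the VIF info consumed by
    build_virtual_machine (field names taken from the log above)."""
    vifs = []
    for vif in network_info:
        vifs.append({
            "network_name": vif["network"]["bridge"],        # 'br-int'
            "mac_address": vif["address"],                   # 'fa:16:3e:...'
            "network_ref": {
                "type": "OpaqueNetwork",
                "network-id": vif["details"]["nsx-logical-switch-id"],
                "network-type": "nsx.LogicalSwitch",
                "use-external-id": True,
            },
            "iface_id": vif["id"],                           # Neutron port id
            "vif_model": vif_model,
        })
    return vifs
```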
{{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 696.183425] env[68617]: DEBUG nova.policy [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e40a491a605f4a43bc59247756bac14d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '30d430f92fc448d88707e4bcabd47d82', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 696.188895] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-580253f1-61ac-4ced-9a04-e5b5cffb6c6a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 696.201689] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Created folder: Project (10e0bbefce664b53ab6dff0effeb96ee) in parent group-v693691. [ 696.201929] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Creating folder: Instances. Parent ref: group-v693695. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 696.202179] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-797e5015-e0cb-46ec-88a4-b02be4023fcc {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 696.212109] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Created folder: Instances in parent group-v693695. [ 696.212427] env[68617]: DEBUG oslo.service.loopingcall [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 696.212646] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 696.212876] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-21457501-ba57-44c5-9d0f-5fb4c5f2f9c5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 696.236563] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 696.236563] env[68617]: value = "task-3470689" [ 696.236563] env[68617]: _type = "Task" [ 696.236563] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 696.245506] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470689, 'name': CreateVM_Task} progress is 0%. 
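The Project/Instances folder pairs are created with Folder.CreateFolder; when two builds race, vCenter raises a DuplicateName fault for the second caller, which the driver can treat as "already exists" and reuse (the same tolerance shown for the org.openstack.compute extension earlier in this log). An idempotent create-or-reuse sketch with hypothetical create_folder/find_folder callables:

```python
class DuplicateName(Exception):
    """Stand-in for the vim.fault.DuplicateName fault."""

def ensure_folder(create_folder, find_folder, parent, name):
    """Create `name` under `parent`, tolerating concurrent creation."""
    try:
        return create_folder(parent, name)   # Folder.CreateFolder
    except DuplicateName:
        # Another request created the folder between our check and the
        # call; look it up and reuse it instead of failing the build.
        return find_folder(parent, name)
```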
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 696.751878] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470689, 'name': CreateVM_Task, 'duration_secs': 0.349833} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 696.751878] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 696.759050] env[68617]: DEBUG nova.network.neutron [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Updating instance_info_cache with network_info: [{"id": "97d01f6d-8cc5-4382-9b4d-c38bef6bd469", "address": "fa:16:3e:17:12:9f", "network": {"id": "e3aee9db-8596-4ea8-943e-5c365382ee22", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.155", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "f1a3ab6230dd468b8019424ce71de8ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cde23701-02ca-4cb4-b5a6-d321f8ac9660", "external-id": "nsx-vlan-transportzone-586", "segmentation_id": 586, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap97d01f6d-8c", "ovs_interfaceid": "97d01f6d-8cc5-4382-9b4d-c38bef6bd469", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 696.788447] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Releasing lock "refresh_cache-c507115c-92a0-4513-aae8-7dc8f95bc0ea" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 696.789523] env[68617]: DEBUG nova.compute.manager [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Instance network_info: |[{"id": "97d01f6d-8cc5-4382-9b4d-c38bef6bd469", "address": "fa:16:3e:17:12:9f", "network": {"id": "e3aee9db-8596-4ea8-943e-5c365382ee22", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.155", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "f1a3ab6230dd468b8019424ce71de8ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": 
"cde23701-02ca-4cb4-b5a6-d321f8ac9660", "external-id": "nsx-vlan-transportzone-586", "segmentation_id": 586, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap97d01f6d-8c", "ovs_interfaceid": "97d01f6d-8cc5-4382-9b4d-c38bef6bd469", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 696.794366] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:17:12:9f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'cde23701-02ca-4cb4-b5a6-d321f8ac9660', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '97d01f6d-8cc5-4382-9b4d-c38bef6bd469', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 696.804398] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Creating folder: Project (b2c347ad328349a29dd82c87395ba43e). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 696.806245] env[68617]: DEBUG oslo_concurrency.lockutils [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 696.810148] env[68617]: DEBUG oslo_concurrency.lockutils [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 696.810148] env[68617]: DEBUG oslo_concurrency.lockutils [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 696.810148] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1e1b4ffc-4f14-49ae-b276-f4caeb70b7ec {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 696.811663] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f5d4ff1d-0922-4daa-98a4-3e13df9c5b99 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 696.820416] env[68617]: DEBUG oslo_vmware.api [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Waiting for the task: (returnval){ [ 696.820416] env[68617]: value = 
"session[527781b0-b30d-888c-2cc2-ff79c79797ba]5216bfb8-c3ee-4d34-e501-f04087bc31c7" [ 696.820416] env[68617]: _type = "Task" [ 696.820416] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 696.824943] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Created folder: Project (b2c347ad328349a29dd82c87395ba43e) in parent group-v693691. [ 696.825145] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Creating folder: Instances. Parent ref: group-v693698. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 696.825715] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c68a18fc-a609-41da-a402-75aeb3d4df5e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 696.835079] env[68617]: DEBUG oslo_vmware.api [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]5216bfb8-c3ee-4d34-e501-f04087bc31c7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 696.844968] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Created folder: Instances in parent group-v693698. [ 696.845231] env[68617]: DEBUG oslo.service.loopingcall [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 696.845429] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 696.845631] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-74d1376d-6c4d-449c-9cbb-d234b3a94f4b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 696.865755] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 696.865755] env[68617]: value = "task-3470692" [ 696.865755] env[68617]: _type = "Task" [ 696.865755] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 696.879759] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470692, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 697.331941] env[68617]: DEBUG oslo_concurrency.lockutils [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 697.332172] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 697.332386] env[68617]: DEBUG oslo_concurrency.lockutils [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 697.378224] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470692, 'name': CreateVM_Task, 'duration_secs': 0.349388} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 697.378224] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 697.379245] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 697.384727] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 697.384727] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 697.384727] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c1b14d4e-c65c-4851-9346-df94b1b8daf3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 697.389033] env[68617]: DEBUG oslo_vmware.api [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Waiting for the task: 
(returnval){ [ 697.389033] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]5201a08c-1cc0-df4b-4868-7f983a97708d" [ 697.389033] env[68617]: _type = "Task" [ 697.389033] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 697.398803] env[68617]: DEBUG oslo_vmware.api [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]5201a08c-1cc0-df4b-4868-7f983a97708d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 697.717314] env[68617]: DEBUG nova.compute.manager [req-df26284c-a8a2-470a-bc12-a4a84a979472 req-17164ccd-5ecd-4330-be4e-8395365f6a3f service nova] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Received event network-vif-plugged-52954f88-8663-4161-97b3-78be5e072d67 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 697.717519] env[68617]: DEBUG oslo_concurrency.lockutils [req-df26284c-a8a2-470a-bc12-a4a84a979472 req-17164ccd-5ecd-4330-be4e-8395365f6a3f service nova] Acquiring lock "b5707ff5-916e-49ce-9aac-9a08ac51bdf2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 697.717713] env[68617]: DEBUG oslo_concurrency.lockutils [req-df26284c-a8a2-470a-bc12-a4a84a979472 req-17164ccd-5ecd-4330-be4e-8395365f6a3f service nova] Lock "b5707ff5-916e-49ce-9aac-9a08ac51bdf2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 697.717919] env[68617]: DEBUG oslo_concurrency.lockutils [req-df26284c-a8a2-470a-bc12-a4a84a979472 req-17164ccd-5ecd-4330-be4e-8395365f6a3f service nova] Lock "b5707ff5-916e-49ce-9aac-9a08ac51bdf2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 697.718034] env[68617]: DEBUG nova.compute.manager [req-df26284c-a8a2-470a-bc12-a4a84a979472 req-17164ccd-5ecd-4330-be4e-8395365f6a3f service nova] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] No waiting events found dispatching network-vif-plugged-52954f88-8663-4161-97b3-78be5e072d67 {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 697.718195] env[68617]: WARNING nova.compute.manager [req-df26284c-a8a2-470a-bc12-a4a84a979472 req-17164ccd-5ecd-4330-be4e-8395365f6a3f service nova] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Received unexpected event network-vif-plugged-52954f88-8663-4161-97b3-78be5e072d67 for instance with vm_state building and task_state spawning. 
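
The CreateVM_Task and SearchDatastore_Task entries above are produced by oslo.vmware's wait_for_task helper: each "progress is 0%" record (_poll_task, api.py:434) is one polling round trip, and the "completed successfully ... duration_secs" record (api.py:444) is the terminal poll. A minimal sketch of that polling pattern follows; it is illustrative only, and `get_task_info` is a hypothetical stand-in for the real VIM property-collector call, not the oslo.vmware API.

    import time

    POLL_INTERVAL = 0.5  # the log shows roughly half-second gaps between polls

    def wait_for_task(get_task_info, task_id, interval=POLL_INTERVAL):
        # Illustrative polling loop; get_task_info(task_id) is assumed to
        # return a dict like {'state': ..., 'progress': ..., 'result': ...}.
        while True:
            info = get_task_info(task_id)
            if info['state'] == 'success':
                return info.get('result')
            if info['state'] == 'error':
                raise RuntimeError(f"Task {task_id} failed: {info.get('error')}")
            # still queued/running: report progress and poll again,
            # mirroring the "progress is 0%" records above
            print(f"Task: {task_id} progress is {info.get('progress', 0)}%.")
            time.sleep(interval)

Note how the log interleaves polls for task-3470689/task-3470692 (CreateVM_Task) with the session-scoped SearchDatastore_Task keys: each caller runs its own loop of this shape against a different task object.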
[ 697.900131] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 697.900485] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 697.900807] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 697.947729] env[68617]: DEBUG nova.network.neutron [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Successfully created port: 6d992fe3-b0e7-4342-b724-e5fc24e07d7d {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 698.644194] env[68617]: DEBUG nova.network.neutron [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Successfully updated port: f91cc40a-05e2-40b3-9da5-5186487f847d {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 698.662828] env[68617]: DEBUG oslo_concurrency.lockutils [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Acquiring lock "refresh_cache-5f4991a3-c34b-45b1-a3af-94d7d990eef1" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 698.663861] env[68617]: DEBUG oslo_concurrency.lockutils [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Acquired lock "refresh_cache-5f4991a3-c34b-45b1-a3af-94d7d990eef1" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 698.663861] env[68617]: DEBUG nova.network.neutron [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 698.854337] env[68617]: DEBUG nova.network.neutron [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Instance cache missing network info. 
{{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 699.858524] env[68617]: DEBUG nova.network.neutron [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Updating instance_info_cache with network_info: [{"id": "f91cc40a-05e2-40b3-9da5-5186487f847d", "address": "fa:16:3e:a9:07:39", "network": {"id": "cc1d083d-f23c-4bc6-9c94-f9271465e167", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1147907742-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "20cf3ab18e0b4e8d89ae53ed3b01abfc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d4ef133-b6f3-41d1-add4-92a1482195cf", "external-id": "nsx-vlan-transportzone-446", "segmentation_id": 446, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf91cc40a-05", "ovs_interfaceid": "f91cc40a-05e2-40b3-9da5-5186487f847d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 699.874345] env[68617]: DEBUG oslo_concurrency.lockutils [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Releasing lock "refresh_cache-5f4991a3-c34b-45b1-a3af-94d7d990eef1" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 699.875276] env[68617]: DEBUG nova.compute.manager [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Instance network_info: |[{"id": "f91cc40a-05e2-40b3-9da5-5186487f847d", "address": "fa:16:3e:a9:07:39", "network": {"id": "cc1d083d-f23c-4bc6-9c94-f9271465e167", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1147907742-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "20cf3ab18e0b4e8d89ae53ed3b01abfc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d4ef133-b6f3-41d1-add4-92a1482195cf", "external-id": "nsx-vlan-transportzone-446", "segmentation_id": 446, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf91cc40a-05", "ovs_interfaceid": "f91cc40a-05e2-40b3-9da5-5186487f847d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 699.877219] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a9:07:39', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6d4ef133-b6f3-41d1-add4-92a1482195cf', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f91cc40a-05e2-40b3-9da5-5186487f847d', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 699.886050] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Creating folder: Project (20cf3ab18e0b4e8d89ae53ed3b01abfc). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 699.886050] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e2ddeb9b-4c5c-435b-9a8e-a7d53fe1d0ea {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 699.896881] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Created folder: Project (20cf3ab18e0b4e8d89ae53ed3b01abfc) in parent group-v693691. [ 699.897468] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Creating folder: Instances. Parent ref: group-v693701. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 699.897468] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-be801f80-10e5-4f66-bf55-989b9d3d2818 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 699.912172] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Created folder: Instances in parent group-v693701. [ 699.912172] env[68617]: DEBUG oslo.service.loopingcall [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 699.912172] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 699.912172] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-518a72d0-fb34-49c4-a445-6f40bee3cd46 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 699.934529] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 699.934529] env[68617]: value = "task-3470695" [ 699.934529] env[68617]: _type = "Task" [ 699.934529] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 699.944238] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470695, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 700.214923] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Acquiring lock "050e2b27-1311-4a9a-b5cf-6bc2f7128eba" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 700.215795] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Lock "050e2b27-1311-4a9a-b5cf-6bc2f7128eba" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 700.231816] env[68617]: DEBUG nova.compute.manager [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Starting instance... 
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 700.326023] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 700.326023] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 700.327674] env[68617]: INFO nova.compute.claims [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 700.448337] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470695, 'name': CreateVM_Task, 'duration_secs': 0.328819} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 700.451526] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 700.453780] env[68617]: DEBUG oslo_concurrency.lockutils [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 700.453908] env[68617]: DEBUG oslo_concurrency.lockutils [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 700.454231] env[68617]: DEBUG oslo_concurrency.lockutils [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 700.454481] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-285bb7c1-5a11-45a6-9301-807c6bc24ada {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 700.462223] env[68617]: DEBUG oslo_vmware.api [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Waiting for the task: (returnval){ [ 700.462223] env[68617]: value = 
"session[527781b0-b30d-888c-2cc2-ff79c79797ba]52fa4ff5-9300-d281-448b-e5aa4e591e26" [ 700.462223] env[68617]: _type = "Task" [ 700.462223] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 700.478734] env[68617]: DEBUG oslo_concurrency.lockutils [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 700.478973] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 700.479232] env[68617]: DEBUG oslo_concurrency.lockutils [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 700.516086] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d48da3bb-2d7c-40e6-a6e2-8b3fde0b4e4a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 700.525125] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c24ed5e2-4a36-4ca2-bdfe-e30ea3371385 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 700.563648] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bee72873-7aed-4cc4-a331-c21873f273ad {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 700.574336] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52a4b3ae-3c1e-4a39-972c-cb7c0bb64594 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 700.590506] env[68617]: DEBUG nova.compute.provider_tree [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 700.597285] env[68617]: DEBUG oslo_concurrency.lockutils [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquiring lock "f13242a0-7e65-4d68-a317-16fb8c4b8f8a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 700.597542] env[68617]: DEBUG oslo_concurrency.lockutils [None 
req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Lock "f13242a0-7e65-4d68-a317-16fb8c4b8f8a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 700.604963] env[68617]: DEBUG nova.scheduler.client.report [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 700.618655] env[68617]: DEBUG nova.compute.manager [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 700.624031] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.299s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 700.624139] env[68617]: DEBUG nova.compute.manager [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 700.671821] env[68617]: DEBUG nova.compute.utils [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 700.673104] env[68617]: DEBUG nova.compute.manager [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Allocating IP information in the background. 
{{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 700.673296] env[68617]: DEBUG nova.network.neutron [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 700.692454] env[68617]: DEBUG nova.compute.manager [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 700.698174] env[68617]: DEBUG oslo_concurrency.lockutils [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 700.698174] env[68617]: DEBUG oslo_concurrency.lockutils [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 700.699112] env[68617]: INFO nova.compute.claims [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 700.722463] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 700.722767] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 700.725156] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 700.725293] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 700.774117] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Skipping network cache update for instance because it is Building. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 700.774117] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 700.774117] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 700.774117] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 700.774117] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 700.774431] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 700.774431] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 700.774431] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 700.774431] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 700.774431] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 700.774431] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 700.774622] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 700.774705] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 700.776736] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 700.776736] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 700.776736] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 700.789568] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 700.848820] env[68617]: DEBUG nova.compute.manager [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 700.887112] env[68617]: DEBUG nova.virt.hardware [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 700.887578] env[68617]: DEBUG nova.virt.hardware [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 700.887749] env[68617]: DEBUG nova.virt.hardware [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 700.887942] env[68617]: DEBUG nova.virt.hardware [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 700.888099] env[68617]: DEBUG nova.virt.hardware [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 700.888244] env[68617]: DEBUG nova.virt.hardware [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 700.888446] env[68617]: DEBUG nova.virt.hardware [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 700.888595] env[68617]: DEBUG nova.virt.hardware [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 700.888760] env[68617]: DEBUG nova.virt.hardware [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 700.888915] env[68617]: DEBUG nova.virt.hardware [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 700.889093] env[68617]: DEBUG nova.virt.hardware [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 700.890266] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe8c998b-4e23-4fee-9554-ef1bd8bae9fd {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 700.902293] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-988ca056-34ff-4a42-8f57-f9fab52408b4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 700.937221] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a936992b-6acc-406c-8252-4d11b2f31b66 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 700.945395] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1acd15b5-ad64-4645-9faf-b0f15e1d7d55 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 700.980943] env[68617]: DEBUG nova.policy [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'aa98a00a71c942828f98feb0ec7de637', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '11dbafd8e6f143718a82d40b45d1e021', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 700.983276] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ae6e6bd-e3bb-4300-940a-d13392a98701 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 700.992275] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c86cbcb-262c-4032-a7eb-d33445759903 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.006295] env[68617]: DEBUG nova.compute.provider_tree [None 
req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 701.016822] env[68617]: DEBUG nova.scheduler.client.report [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 701.034268] env[68617]: DEBUG oslo_concurrency.lockutils [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.336s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 701.034453] env[68617]: DEBUG nova.compute.manager [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Start building networks asynchronously for instance. 
{{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 701.037535] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.248s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 701.040891] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 701.040891] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 701.040891] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-343c3e67-b977-4cfc-abe9-99ad658e7990 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.051627] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08b1ca2a-637c-47da-8625-beb991ec0780 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.072854] env[68617]: DEBUG nova.network.neutron [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Successfully updated port: 6d992fe3-b0e7-4342-b724-e5fc24e07d7d {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 701.074540] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c879a5d6-f4d9-4ab5-a897-774330f05093 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.084037] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21061b95-0795-4510-83cc-c240fb4d0eac {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.089836] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Acquiring lock "refresh_cache-b95883b2-0366-4f52-bdf2-aa6259fafc58" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 701.090308] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Acquired lock "refresh_cache-b95883b2-0366-4f52-bdf2-aa6259fafc58" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 701.090308] env[68617]: DEBUG nova.network.neutron [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: 
b95883b2-0366-4f52-bdf2-aa6259fafc58] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 701.128386] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180937MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 701.128386] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 701.128386] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 701.133252] env[68617]: DEBUG nova.compute.utils [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 701.137856] env[68617]: DEBUG nova.compute.manager [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 701.137856] env[68617]: DEBUG nova.network.neutron [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 701.152257] env[68617]: DEBUG nova.compute.manager [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 701.182376] env[68617]: DEBUG nova.network.neutron [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 701.286996] env[68617]: DEBUG nova.compute.manager [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 701.301347] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b5707ff5-916e-49ce-9aac-9a08ac51bdf2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 701.301460] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 26f6016f-5fb5-4fd2-9ee3-648297d969b3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 701.301554] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance c507115c-92a0-4513-aae8-7dc8f95bc0ea actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 701.301712] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 5f4991a3-c34b-45b1-a3af-94d7d990eef1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 701.302370] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b95883b2-0366-4f52-bdf2-aa6259fafc58 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 701.302370] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 050e2b27-1311-4a9a-b5cf-6bc2f7128eba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 701.302370] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f13242a0-7e65-4d68-a317-16fb8c4b8f8a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 701.302370] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 701.302546] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 701.385126] env[68617]: DEBUG nova.virt.hardware [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 701.385126] env[68617]: DEBUG nova.virt.hardware [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 701.385126] env[68617]: DEBUG nova.virt.hardware [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 701.385865] env[68617]: DEBUG nova.virt.hardware [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 701.385865] env[68617]: DEBUG nova.virt.hardware [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 701.385865] env[68617]: DEBUG nova.virt.hardware [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 701.385865] env[68617]: DEBUG nova.virt.hardware [None 
req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 701.385865] env[68617]: DEBUG nova.virt.hardware [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 701.386263] env[68617]: DEBUG nova.virt.hardware [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 701.386263] env[68617]: DEBUG nova.virt.hardware [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 701.386263] env[68617]: DEBUG nova.virt.hardware [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 701.386263] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d1a40b8-b8e8-419a-8607-dd4d27365c71 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.386263] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45254a77-ef12-44b5-94cd-82336d6301b7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.413611] env[68617]: DEBUG nova.compute.manager [req-14377388-2cd0-4151-9e1f-df69667b8b3e req-2dee930d-f0b9-4bac-a47a-6c6c0db9afc4 service nova] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Received event network-vif-plugged-f91cc40a-05e2-40b3-9da5-5186487f847d {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 701.413835] env[68617]: DEBUG oslo_concurrency.lockutils [req-14377388-2cd0-4151-9e1f-df69667b8b3e req-2dee930d-f0b9-4bac-a47a-6c6c0db9afc4 service nova] Acquiring lock "5f4991a3-c34b-45b1-a3af-94d7d990eef1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 701.414113] env[68617]: DEBUG oslo_concurrency.lockutils [req-14377388-2cd0-4151-9e1f-df69667b8b3e req-2dee930d-f0b9-4bac-a47a-6c6c0db9afc4 service nova] Lock "5f4991a3-c34b-45b1-a3af-94d7d990eef1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 701.414317] env[68617]: DEBUG oslo_concurrency.lockutils [req-14377388-2cd0-4151-9e1f-df69667b8b3e 
req-2dee930d-f0b9-4bac-a47a-6c6c0db9afc4 service nova] Lock "5f4991a3-c34b-45b1-a3af-94d7d990eef1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 701.414503] env[68617]: DEBUG nova.compute.manager [req-14377388-2cd0-4151-9e1f-df69667b8b3e req-2dee930d-f0b9-4bac-a47a-6c6c0db9afc4 service nova] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] No waiting events found dispatching network-vif-plugged-f91cc40a-05e2-40b3-9da5-5186487f847d {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 701.414666] env[68617]: WARNING nova.compute.manager [req-14377388-2cd0-4151-9e1f-df69667b8b3e req-2dee930d-f0b9-4bac-a47a-6c6c0db9afc4 service nova] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Received unexpected event network-vif-plugged-f91cc40a-05e2-40b3-9da5-5186487f847d for instance with vm_state building and task_state spawning. [ 701.470692] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-109f79f2-30ea-4684-82d3-c64acbe3ae6d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.478804] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-388278a6-9a58-40e3-b38e-5cf9fabeabca {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.513712] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed2a358e-c5a9-4d11-8432-a02230de1bb0 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.523725] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75256283-0032-43ab-b5e7-b0a221d549dc {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.540257] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 701.554837] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 701.585852] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 701.585852] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.457s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 701.634470] env[68617]: DEBUG nova.policy [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2c10f61e025e469890198c323de0578b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '828be21ced7d4d11a462ae49d04280ba', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 701.973035] env[68617]: DEBUG nova.network.neutron [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Updating instance_info_cache with network_info: [{"id": "6d992fe3-b0e7-4342-b724-e5fc24e07d7d", "address": "fa:16:3e:75:46:a0", "network": {"id": "baaa01e2-3cba-4bce-8ebe-04dd5ff2e4f0", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-360587695-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "30d430f92fc448d88707e4bcabd47d82", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dad4f433-bb0b-45c7-8040-972ef2277f75", "external-id": "nsx-vlan-transportzone-451", "segmentation_id": 451, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6d992fe3-b0", "ovs_interfaceid": "6d992fe3-b0e7-4342-b724-e5fc24e07d7d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 702.018020] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Releasing lock "refresh_cache-b95883b2-0366-4f52-bdf2-aa6259fafc58" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 702.018020] env[68617]: DEBUG nova.compute.manager [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Instance network_info: |[{"id": "6d992fe3-b0e7-4342-b724-e5fc24e07d7d", "address": "fa:16:3e:75:46:a0", "network": {"id": "baaa01e2-3cba-4bce-8ebe-04dd5ff2e4f0", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-360587695-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "30d430f92fc448d88707e4bcabd47d82", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dad4f433-bb0b-45c7-8040-972ef2277f75", "external-id": "nsx-vlan-transportzone-451", "segmentation_id": 451, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6d992fe3-b0", "ovs_interfaceid": "6d992fe3-b0e7-4342-b724-e5fc24e07d7d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 702.018446] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:75:46:a0', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'dad4f433-bb0b-45c7-8040-972ef2277f75', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6d992fe3-b0e7-4342-b724-e5fc24e07d7d', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 702.031913] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Creating folder: Project (30d430f92fc448d88707e4bcabd47d82). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 702.032140] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1b0d47d6-eca2-424c-8900-160089e8d9c9 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 702.044324] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Created folder: Project (30d430f92fc448d88707e4bcabd47d82) in parent group-v693691. [ 702.044324] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Creating folder: Instances. Parent ref: group-v693704. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 702.044324] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8964ca0d-ea06-4ac8-a31f-86ea7846a993 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 702.055553] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Created folder: Instances in parent group-v693704. 
[ 702.055553] env[68617]: DEBUG oslo.service.loopingcall [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 702.055553] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 702.055553] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-bcd1131e-1fad-4cf6-8359-8fa8c7150d8b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 702.087737] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 702.087737] env[68617]: value = "task-3470698" [ 702.087737] env[68617]: _type = "Task" [ 702.087737] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 702.099851] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470698, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 702.379897] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Acquiring lock "3b95678b-dfc5-4610-a51e-2ae12fbe274b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 702.380632] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Lock "3b95678b-dfc5-4610-a51e-2ae12fbe274b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 702.396517] env[68617]: DEBUG nova.compute.manager [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Starting instance... 
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 702.481934] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 702.481934] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 702.483510] env[68617]: INFO nova.compute.claims [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 702.602565] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470698, 'name': CreateVM_Task, 'duration_secs': 0.307185} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 702.602620] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 702.603319] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 702.603552] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 702.603760] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 702.604700] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5685d36e-d9e1-4980-857f-af58f3f87acc {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 702.615106] env[68617]: DEBUG oslo_vmware.api [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Waiting for the task: (returnval){ [ 702.615106] env[68617]: value = 
"session[527781b0-b30d-888c-2cc2-ff79c79797ba]522a7158-6207-2474-6982-5f68f2ffe450" [ 702.615106] env[68617]: _type = "Task" [ 702.615106] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 702.626091] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 702.627053] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 702.627053] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 702.746762] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4eac30af-7f32-4ae5-b07f-af11ab638b07 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 702.755696] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e57a3eb6-e2b8-4ec1-b662-6eb9ede0e1fc {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 702.796889] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c504e18f-5801-4124-8d83-bc4c459a414b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 702.806339] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15528b9b-6320-4129-8f32-4056738028e4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 702.820284] env[68617]: DEBUG nova.compute.provider_tree [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 702.832789] env[68617]: DEBUG nova.scheduler.client.report [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': 
{'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 702.851147] env[68617]: DEBUG nova.network.neutron [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Successfully created port: 1d043def-bc38-40fd-85ba-95148f129598 {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 702.855077] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.373s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 702.855747] env[68617]: DEBUG nova.compute.manager [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 702.922141] env[68617]: DEBUG nova.compute.utils [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 702.925902] env[68617]: DEBUG nova.compute.manager [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Not allocating networking since 'none' was specified. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 702.948922] env[68617]: DEBUG nova.compute.manager [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 703.028676] env[68617]: DEBUG nova.compute.manager [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 703.070017] env[68617]: DEBUG nova.virt.hardware [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 703.072398] env[68617]: DEBUG nova.virt.hardware [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 703.073146] env[68617]: DEBUG nova.virt.hardware [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 703.073146] env[68617]: DEBUG nova.virt.hardware [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 703.073146] env[68617]: DEBUG nova.virt.hardware [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 703.073346] env[68617]: DEBUG nova.virt.hardware [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 703.073393] env[68617]: DEBUG nova.virt.hardware [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 703.073530] env[68617]: DEBUG nova.virt.hardware [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 703.074059] 
env[68617]: DEBUG nova.virt.hardware [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 703.074059] env[68617]: DEBUG nova.virt.hardware [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 703.074059] env[68617]: DEBUG nova.virt.hardware [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 703.074956] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55c0cf1e-0346-48d3-a9ff-0659116d6599 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.088147] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c6a5332-834d-4897-a62f-28a6d0a5c70e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.105653] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Instance VIF info [] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 703.112169] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Creating folder: Project (da442a9eb7cc47fca4f4ba5106bfedc5). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 703.113217] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dec2f022-b8e5-4187-b1fd-4b5bdbf1e8e1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.125830] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Created folder: Project (da442a9eb7cc47fca4f4ba5106bfedc5) in parent group-v693691. [ 703.126102] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Creating folder: Instances. Parent ref: group-v693707. 
{{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 703.126377] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-43066aa6-4988-413a-8510-4a995d53eb0b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.137547] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Created folder: Instances in parent group-v693707. [ 703.137852] env[68617]: DEBUG oslo.service.loopingcall [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 703.138091] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 703.139488] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8e518afb-ef9a-4e52-a32d-12442aee846e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.157026] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 703.157026] env[68617]: value = "task-3470701" [ 703.157026] env[68617]: _type = "Task" [ 703.157026] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 703.164990] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470701, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 703.367201] env[68617]: DEBUG nova.network.neutron [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Successfully created port: 66c72d22-142a-4cfd-9b6f-4569f3c69765 {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 703.377900] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Acquiring lock "6300077d-5aa7-4794-8ba2-1ec30151c15c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 703.378147] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Lock "6300077d-5aa7-4794-8ba2-1ec30151c15c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 703.389801] env[68617]: DEBUG nova.compute.manager [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 703.945304] env[68617]: DEBUG nova.compute.manager [req-e1ccd4c8-70c7-4422-99bf-693a17783717 req-c19ac84a-6878-4c63-9c48-7179911a5380 service nova] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Received event network-vif-plugged-97d01f6d-8cc5-4382-9b4d-c38bef6bd469 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 703.946238] env[68617]: DEBUG oslo_concurrency.lockutils [req-e1ccd4c8-70c7-4422-99bf-693a17783717 req-c19ac84a-6878-4c63-9c48-7179911a5380 service nova] Acquiring lock "c507115c-92a0-4513-aae8-7dc8f95bc0ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 703.946238] env[68617]: DEBUG oslo_concurrency.lockutils [req-e1ccd4c8-70c7-4422-99bf-693a17783717 req-c19ac84a-6878-4c63-9c48-7179911a5380 service nova] Lock "c507115c-92a0-4513-aae8-7dc8f95bc0ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 703.946238] env[68617]: DEBUG oslo_concurrency.lockutils [req-e1ccd4c8-70c7-4422-99bf-693a17783717 req-c19ac84a-6878-4c63-9c48-7179911a5380 service nova] Lock "c507115c-92a0-4513-aae8-7dc8f95bc0ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 703.946238] env[68617]: DEBUG nova.compute.manager [req-e1ccd4c8-70c7-4422-99bf-693a17783717 req-c19ac84a-6878-4c63-9c48-7179911a5380 service nova] [instance: 
c507115c-92a0-4513-aae8-7dc8f95bc0ea] No waiting events found dispatching network-vif-plugged-97d01f6d-8cc5-4382-9b4d-c38bef6bd469 {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 703.946431] env[68617]: WARNING nova.compute.manager [req-e1ccd4c8-70c7-4422-99bf-693a17783717 req-c19ac84a-6878-4c63-9c48-7179911a5380 service nova] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Received unexpected event network-vif-plugged-97d01f6d-8cc5-4382-9b4d-c38bef6bd469 for instance with vm_state building and task_state spawning. [ 703.946431] env[68617]: DEBUG nova.compute.manager [req-e1ccd4c8-70c7-4422-99bf-693a17783717 req-c19ac84a-6878-4c63-9c48-7179911a5380 service nova] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Received event network-changed-52954f88-8663-4161-97b3-78be5e072d67 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 703.946576] env[68617]: DEBUG nova.compute.manager [req-e1ccd4c8-70c7-4422-99bf-693a17783717 req-c19ac84a-6878-4c63-9c48-7179911a5380 service nova] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Refreshing instance network info cache due to event network-changed-52954f88-8663-4161-97b3-78be5e072d67. {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 703.946760] env[68617]: DEBUG oslo_concurrency.lockutils [req-e1ccd4c8-70c7-4422-99bf-693a17783717 req-c19ac84a-6878-4c63-9c48-7179911a5380 service nova] Acquiring lock "refresh_cache-b5707ff5-916e-49ce-9aac-9a08ac51bdf2" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 703.946893] env[68617]: DEBUG oslo_concurrency.lockutils [req-e1ccd4c8-70c7-4422-99bf-693a17783717 req-c19ac84a-6878-4c63-9c48-7179911a5380 service nova] Acquired lock "refresh_cache-b5707ff5-916e-49ce-9aac-9a08ac51bdf2" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 703.947058] env[68617]: DEBUG nova.network.neutron [req-e1ccd4c8-70c7-4422-99bf-693a17783717 req-c19ac84a-6878-4c63-9c48-7179911a5380 service nova] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Refreshing network info cache for port 52954f88-8663-4161-97b3-78be5e072d67 {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 703.962691] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470701, 'name': CreateVM_Task, 'duration_secs': 0.279856} completed successfully. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 703.963060] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 703.963947] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 703.963947] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 703.964082] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 703.964310] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7af1a805-fdf5-42bb-829d-a2ef2f761aa8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.969550] env[68617]: DEBUG oslo_vmware.api [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Waiting for the task: (returnval){ [ 703.969550] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]522547d0-056f-53ae-61ae-4363959c38ce" [ 703.969550] env[68617]: _type = "Task" [ 703.969550] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 703.978406] env[68617]: DEBUG oslo_vmware.api [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]522547d0-056f-53ae-61ae-4363959c38ce, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 703.984571] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 703.984802] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 703.987848] env[68617]: INFO nova.compute.claims [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 704.213849] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63276412-aa76-4885-86b0-73298b3afe87 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 704.224314] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6b55ab8-2634-4813-b791-b21dbc2503a9 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 704.268205] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e406c26d-da02-4f91-9415-c43de520e4c2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 704.277104] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9203a63a-a3e7-4942-9070-18505dc10338 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 704.293273] env[68617]: DEBUG nova.compute.provider_tree [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 704.311503] env[68617]: DEBUG nova.scheduler.client.report [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 704.346019] env[68617]: DEBUG oslo_concurrency.lockutils [None 
req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.361s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 704.346582] env[68617]: DEBUG nova.compute.manager [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 704.413836] env[68617]: DEBUG nova.compute.utils [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 704.416455] env[68617]: DEBUG nova.compute.manager [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 704.416671] env[68617]: DEBUG nova.network.neutron [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 704.432693] env[68617]: DEBUG nova.compute.manager [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Start building block device mappings for instance. 
{{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 704.485593] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 704.485847] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 704.486171] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 704.530718] env[68617]: DEBUG nova.compute.manager [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Start spawning the instance on the hypervisor. {{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 704.576524] env[68617]: DEBUG nova.virt.hardware [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 704.576623] env[68617]: DEBUG nova.virt.hardware [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 704.577751] env[68617]: DEBUG nova.virt.hardware [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 704.577751] env[68617]: DEBUG nova.virt.hardware [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 
tempest-ServerDiagnosticsTest-773527931-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 704.580219] env[68617]: DEBUG nova.virt.hardware [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 704.580463] env[68617]: DEBUG nova.virt.hardware [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 704.580691] env[68617]: DEBUG nova.virt.hardware [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 704.580844] env[68617]: DEBUG nova.virt.hardware [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 704.581014] env[68617]: DEBUG nova.virt.hardware [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 704.581181] env[68617]: DEBUG nova.virt.hardware [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 704.581345] env[68617]: DEBUG nova.virt.hardware [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 704.582223] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85775993-06a0-47f0-b2ff-9dac5890f4e3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 704.591027] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0503204-863e-4da6-9d89-db8747737f34 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 704.859777] env[68617]: DEBUG nova.policy [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66e5dd7985e14876a2a3c68a0e93e489', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 
'1fb6f6ef0b1f47f482b359aa265cb6a7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 705.419170] env[68617]: DEBUG nova.network.neutron [req-e1ccd4c8-70c7-4422-99bf-693a17783717 req-c19ac84a-6878-4c63-9c48-7179911a5380 service nova] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Updated VIF entry in instance network info cache for port 52954f88-8663-4161-97b3-78be5e072d67. {{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 705.420012] env[68617]: DEBUG nova.network.neutron [req-e1ccd4c8-70c7-4422-99bf-693a17783717 req-c19ac84a-6878-4c63-9c48-7179911a5380 service nova] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Updating instance_info_cache with network_info: [{"id": "52954f88-8663-4161-97b3-78be5e072d67", "address": "fa:16:3e:6a:59:10", "network": {"id": "064d41fb-a622-47db-b457-8854c04c637e", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1693772355-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "10e0bbefce664b53ab6dff0effeb96ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f39e3b37-7906-4bbc-820e-ceac74e4d827", "external-id": "nsx-vlan-transportzone-328", "segmentation_id": 328, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap52954f88-86", "ovs_interfaceid": "52954f88-8663-4161-97b3-78be5e072d67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 705.438266] env[68617]: DEBUG oslo_concurrency.lockutils [req-e1ccd4c8-70c7-4422-99bf-693a17783717 req-c19ac84a-6878-4c63-9c48-7179911a5380 service nova] Releasing lock "refresh_cache-b5707ff5-916e-49ce-9aac-9a08ac51bdf2" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 705.438563] env[68617]: DEBUG nova.compute.manager [req-e1ccd4c8-70c7-4422-99bf-693a17783717 req-c19ac84a-6878-4c63-9c48-7179911a5380 service nova] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Received event network-changed-97d01f6d-8cc5-4382-9b4d-c38bef6bd469 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 705.438733] env[68617]: DEBUG nova.compute.manager [req-e1ccd4c8-70c7-4422-99bf-693a17783717 req-c19ac84a-6878-4c63-9c48-7179911a5380 service nova] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Refreshing instance network info cache due to event network-changed-97d01f6d-8cc5-4382-9b4d-c38bef6bd469. 
{{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 705.438935] env[68617]: DEBUG oslo_concurrency.lockutils [req-e1ccd4c8-70c7-4422-99bf-693a17783717 req-c19ac84a-6878-4c63-9c48-7179911a5380 service nova] Acquiring lock "refresh_cache-c507115c-92a0-4513-aae8-7dc8f95bc0ea" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 705.439103] env[68617]: DEBUG oslo_concurrency.lockutils [req-e1ccd4c8-70c7-4422-99bf-693a17783717 req-c19ac84a-6878-4c63-9c48-7179911a5380 service nova] Acquired lock "refresh_cache-c507115c-92a0-4513-aae8-7dc8f95bc0ea" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 705.439263] env[68617]: DEBUG nova.network.neutron [req-e1ccd4c8-70c7-4422-99bf-693a17783717 req-c19ac84a-6878-4c63-9c48-7179911a5380 service nova] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Refreshing network info cache for port 97d01f6d-8cc5-4382-9b4d-c38bef6bd469 {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 706.229656] env[68617]: DEBUG nova.network.neutron [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Successfully created port: 3cce742a-a0fb-42cc-b3fa-ac15c3c48765 {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 706.255150] env[68617]: DEBUG nova.network.neutron [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Successfully updated port: 66c72d22-142a-4cfd-9b6f-4569f3c69765 {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 706.268305] env[68617]: DEBUG oslo_concurrency.lockutils [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquiring lock "refresh_cache-f13242a0-7e65-4d68-a317-16fb8c4b8f8a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 706.268507] env[68617]: DEBUG oslo_concurrency.lockutils [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquired lock "refresh_cache-f13242a0-7e65-4d68-a317-16fb8c4b8f8a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 706.268661] env[68617]: DEBUG nova.network.neutron [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 706.355231] env[68617]: DEBUG nova.network.neutron [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Instance cache missing network info. 
{{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 706.568658] env[68617]: DEBUG nova.network.neutron [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Successfully updated port: 1d043def-bc38-40fd-85ba-95148f129598 {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 706.580000] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Acquiring lock "refresh_cache-050e2b27-1311-4a9a-b5cf-6bc2f7128eba" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 706.581890] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Acquired lock "refresh_cache-050e2b27-1311-4a9a-b5cf-6bc2f7128eba" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 706.581890] env[68617]: DEBUG nova.network.neutron [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 706.742553] env[68617]: DEBUG nova.network.neutron [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 706.866160] env[68617]: DEBUG nova.network.neutron [req-e1ccd4c8-70c7-4422-99bf-693a17783717 req-c19ac84a-6878-4c63-9c48-7179911a5380 service nova] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Updated VIF entry in instance network info cache for port 97d01f6d-8cc5-4382-9b4d-c38bef6bd469. 
{{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 706.866585] env[68617]: DEBUG nova.network.neutron [req-e1ccd4c8-70c7-4422-99bf-693a17783717 req-c19ac84a-6878-4c63-9c48-7179911a5380 service nova] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Updating instance_info_cache with network_info: [{"id": "97d01f6d-8cc5-4382-9b4d-c38bef6bd469", "address": "fa:16:3e:17:12:9f", "network": {"id": "e3aee9db-8596-4ea8-943e-5c365382ee22", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.155", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "f1a3ab6230dd468b8019424ce71de8ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cde23701-02ca-4cb4-b5a6-d321f8ac9660", "external-id": "nsx-vlan-transportzone-586", "segmentation_id": 586, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap97d01f6d-8c", "ovs_interfaceid": "97d01f6d-8cc5-4382-9b4d-c38bef6bd469", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 706.883488] env[68617]: DEBUG oslo_concurrency.lockutils [req-e1ccd4c8-70c7-4422-99bf-693a17783717 req-c19ac84a-6878-4c63-9c48-7179911a5380 service nova] Releasing lock "refresh_cache-c507115c-92a0-4513-aae8-7dc8f95bc0ea" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 706.917244] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquiring lock "4ea5887f-84bd-4629-b568-e73c78af0ad4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 706.917244] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Lock "4ea5887f-84bd-4629-b568-e73c78af0ad4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 706.931403] env[68617]: DEBUG nova.compute.manager [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Starting instance... 
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 707.006699] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 707.008844] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 707.012666] env[68617]: INFO nova.compute.claims [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 707.210724] env[68617]: DEBUG nova.network.neutron [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Updating instance_info_cache with network_info: [{"id": "66c72d22-142a-4cfd-9b6f-4569f3c69765", "address": "fa:16:3e:05:1e:b9", "network": {"id": "0f2e6893-43e2-458a-8326-dd03f1a6b1a7", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2034507765-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "828be21ced7d4d11a462ae49d04280ba", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "49b5df12-d801-4140-8816-2fd401608c7d", "external-id": "nsx-vlan-transportzone-326", "segmentation_id": 326, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap66c72d22-14", "ovs_interfaceid": "66c72d22-142a-4cfd-9b6f-4569f3c69765", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 707.235045] env[68617]: DEBUG oslo_concurrency.lockutils [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Releasing lock "refresh_cache-f13242a0-7e65-4d68-a317-16fb8c4b8f8a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 707.235333] env[68617]: DEBUG nova.compute.manager [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Instance network_info: |[{"id": "66c72d22-142a-4cfd-9b6f-4569f3c69765", 
"address": "fa:16:3e:05:1e:b9", "network": {"id": "0f2e6893-43e2-458a-8326-dd03f1a6b1a7", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2034507765-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "828be21ced7d4d11a462ae49d04280ba", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "49b5df12-d801-4140-8816-2fd401608c7d", "external-id": "nsx-vlan-transportzone-326", "segmentation_id": 326, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap66c72d22-14", "ovs_interfaceid": "66c72d22-142a-4cfd-9b6f-4569f3c69765", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 707.235955] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:05:1e:b9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '49b5df12-d801-4140-8816-2fd401608c7d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '66c72d22-142a-4cfd-9b6f-4569f3c69765', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 707.245132] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Creating folder: Project (828be21ced7d4d11a462ae49d04280ba). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 707.246053] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-af04fa56-b69d-453d-852b-f6fc4c36fd36 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.259227] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Created folder: Project (828be21ced7d4d11a462ae49d04280ba) in parent group-v693691. [ 707.259227] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Creating folder: Instances. Parent ref: group-v693710. 
{{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 707.259369] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b32576e3-a731-4241-8920-89c54867939e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.264779] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6c47fe0-dd18-4b4d-9128-3b039466e786 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.269441] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Created folder: Instances in parent group-v693710. [ 707.269708] env[68617]: DEBUG oslo.service.loopingcall [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 707.269985] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 707.270112] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6b30d4a2-8128-4288-90a4-b3e79cc0c165 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.297778] env[68617]: DEBUG oslo_concurrency.lockutils [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "6eef6e24-cf49-458b-ae37-8da4e02045f8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 707.298216] env[68617]: DEBUG oslo_concurrency.lockutils [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "6eef6e24-cf49-458b-ae37-8da4e02045f8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 707.299481] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7cba5af-576b-4a35-a147-9b3f327fecd8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.306306] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 707.306306] env[68617]: value = "task-3470704" [ 707.306306] env[68617]: _type = "Task" [ 707.306306] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 707.343844] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a38c259-14f6-4563-a9cc-8a643fa32606 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.350732] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470704, 'name': CreateVM_Task} progress is 25%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 707.356642] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9e975d2-df3f-4e39-a8cf-a31cfe29a2b3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.371244] env[68617]: DEBUG nova.compute.provider_tree [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 707.386011] env[68617]: DEBUG nova.scheduler.client.report [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 707.403746] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.396s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 707.403905] env[68617]: DEBUG nova.compute.manager [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 707.462424] env[68617]: DEBUG nova.compute.utils [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 707.464045] env[68617]: DEBUG nova.compute.manager [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Allocating IP information in the background. 
{{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 707.464350] env[68617]: DEBUG nova.network.neutron [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 707.481224] env[68617]: DEBUG nova.compute.manager [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 707.558236] env[68617]: DEBUG nova.compute.manager [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Start spawning the instance on the hypervisor. {{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 707.593710] env[68617]: DEBUG nova.virt.hardware [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 707.594081] env[68617]: DEBUG nova.virt.hardware [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 707.594326] env[68617]: DEBUG nova.virt.hardware [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 707.594326] env[68617]: DEBUG nova.virt.hardware [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 707.594469] env[68617]: DEBUG nova.virt.hardware [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 707.594633] 
env[68617]: DEBUG nova.virt.hardware [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 707.594864] env[68617]: DEBUG nova.virt.hardware [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 707.595070] env[68617]: DEBUG nova.virt.hardware [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 707.595271] env[68617]: DEBUG nova.virt.hardware [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 707.595481] env[68617]: DEBUG nova.virt.hardware [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 707.595605] env[68617]: DEBUG nova.virt.hardware [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 707.596874] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c89940d-37b2-46db-92be-bacd580253ef {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.605440] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df1602a6-6d48-4ec7-87b2-c2e91dfc7e3f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.650850] env[68617]: DEBUG nova.policy [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2c10f61e025e469890198c323de0578b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '828be21ced7d4d11a462ae49d04280ba', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 707.738037] env[68617]: DEBUG nova.network.neutron [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 
tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Updating instance_info_cache with network_info: [{"id": "1d043def-bc38-40fd-85ba-95148f129598", "address": "fa:16:3e:61:7b:9d", "network": {"id": "e3aee9db-8596-4ea8-943e-5c365382ee22", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.89", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "f1a3ab6230dd468b8019424ce71de8ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cde23701-02ca-4cb4-b5a6-d321f8ac9660", "external-id": "nsx-vlan-transportzone-586", "segmentation_id": 586, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1d043def-bc", "ovs_interfaceid": "1d043def-bc38-40fd-85ba-95148f129598", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 707.750671] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Releasing lock "refresh_cache-050e2b27-1311-4a9a-b5cf-6bc2f7128eba" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 707.750987] env[68617]: DEBUG nova.compute.manager [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Instance network_info: |[{"id": "1d043def-bc38-40fd-85ba-95148f129598", "address": "fa:16:3e:61:7b:9d", "network": {"id": "e3aee9db-8596-4ea8-943e-5c365382ee22", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.89", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "f1a3ab6230dd468b8019424ce71de8ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cde23701-02ca-4cb4-b5a6-d321f8ac9660", "external-id": "nsx-vlan-transportzone-586", "segmentation_id": 586, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1d043def-bc", "ovs_interfaceid": "1d043def-bc38-40fd-85ba-95148f129598", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 707.751371] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 
tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:61:7b:9d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'cde23701-02ca-4cb4-b5a6-d321f8ac9660', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1d043def-bc38-40fd-85ba-95148f129598', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 707.759437] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Creating folder: Project (11dbafd8e6f143718a82d40b45d1e021). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 707.760041] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5b39f9d7-8e00-458e-8800-4a3896d4b4ea {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.772022] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Created folder: Project (11dbafd8e6f143718a82d40b45d1e021) in parent group-v693691. [ 707.772022] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Creating folder: Instances. Parent ref: group-v693713. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 707.772022] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-65e303f6-db1d-4427-b620-b54c29cbd758 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.778117] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Created folder: Instances in parent group-v693713. [ 707.778351] env[68617]: DEBUG oslo.service.loopingcall [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 707.778525] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 707.778717] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-33b71a7a-189c-4590-9fcb-acb1626694ab {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.799234] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 707.799234] env[68617]: value = "task-3470707" [ 707.799234] env[68617]: _type = "Task" [ 707.799234] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 707.806589] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470707, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 707.819607] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470704, 'name': CreateVM_Task, 'duration_secs': 0.302284} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 707.819914] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 707.820679] env[68617]: DEBUG oslo_concurrency.lockutils [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 707.821624] env[68617]: DEBUG oslo_concurrency.lockutils [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 707.822069] env[68617]: DEBUG oslo_concurrency.lockutils [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 707.822427] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-87c4c69a-3ea0-4e35-9470-6fbae9d6a653 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.828391] env[68617]: DEBUG oslo_vmware.api [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Waiting for the task: (returnval){ [ 707.828391] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]5220ecf1-7baf-3be8-a5f9-bfb6a13a57de" [ 707.828391] env[68617]: _type = "Task" [ 707.828391] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 707.838471] env[68617]: DEBUG oslo_vmware.api [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]5220ecf1-7baf-3be8-a5f9-bfb6a13a57de, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 708.080309] env[68617]: DEBUG nova.compute.manager [req-0d514fff-5916-4ff9-b376-e88c374e656c req-1a1a4764-9389-4586-ba7a-4a6eb487600a service nova] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Received event network-changed-f91cc40a-05e2-40b3-9da5-5186487f847d {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 708.081019] env[68617]: DEBUG nova.compute.manager [req-0d514fff-5916-4ff9-b376-e88c374e656c req-1a1a4764-9389-4586-ba7a-4a6eb487600a service nova] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Refreshing instance network info cache due to event network-changed-f91cc40a-05e2-40b3-9da5-5186487f847d. {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 708.081019] env[68617]: DEBUG oslo_concurrency.lockutils [req-0d514fff-5916-4ff9-b376-e88c374e656c req-1a1a4764-9389-4586-ba7a-4a6eb487600a service nova] Acquiring lock "refresh_cache-5f4991a3-c34b-45b1-a3af-94d7d990eef1" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 708.081019] env[68617]: DEBUG oslo_concurrency.lockutils [req-0d514fff-5916-4ff9-b376-e88c374e656c req-1a1a4764-9389-4586-ba7a-4a6eb487600a service nova] Acquired lock "refresh_cache-5f4991a3-c34b-45b1-a3af-94d7d990eef1" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 708.081919] env[68617]: DEBUG nova.network.neutron [req-0d514fff-5916-4ff9-b376-e88c374e656c req-1a1a4764-9389-4586-ba7a-4a6eb487600a service nova] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Refreshing network info cache for port f91cc40a-05e2-40b3-9da5-5186487f847d {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 708.311383] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470707, 'name': CreateVM_Task, 'duration_secs': 0.359315} completed successfully. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 708.311727] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 708.312737] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 708.341227] env[68617]: DEBUG oslo_concurrency.lockutils [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 708.341565] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 708.341827] env[68617]: DEBUG oslo_concurrency.lockutils [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 708.342089] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 708.342428] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 708.344610] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b026d6fa-6454-4710-be1c-cc6a71ef45e2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 708.353022] env[68617]: DEBUG oslo_vmware.api [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Waiting for the task: (returnval){ [ 708.353022] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]523354a1-253f-0316-e72d-7601fb5939ed" [ 708.353022] env[68617]: _type = "Task" [ 708.353022] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 708.365400] env[68617]: DEBUG oslo_vmware.api [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]523354a1-253f-0316-e72d-7601fb5939ed, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 708.866717] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 708.866717] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 708.867089] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 708.920764] env[68617]: DEBUG nova.network.neutron [req-0d514fff-5916-4ff9-b376-e88c374e656c req-1a1a4764-9389-4586-ba7a-4a6eb487600a service nova] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Updated VIF entry in instance network info cache for port f91cc40a-05e2-40b3-9da5-5186487f847d. 
{{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 708.921188] env[68617]: DEBUG nova.network.neutron [req-0d514fff-5916-4ff9-b376-e88c374e656c req-1a1a4764-9389-4586-ba7a-4a6eb487600a service nova] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Updating instance_info_cache with network_info: [{"id": "f91cc40a-05e2-40b3-9da5-5186487f847d", "address": "fa:16:3e:a9:07:39", "network": {"id": "cc1d083d-f23c-4bc6-9c94-f9271465e167", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1147907742-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "20cf3ab18e0b4e8d89ae53ed3b01abfc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d4ef133-b6f3-41d1-add4-92a1482195cf", "external-id": "nsx-vlan-transportzone-446", "segmentation_id": 446, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf91cc40a-05", "ovs_interfaceid": "f91cc40a-05e2-40b3-9da5-5186487f847d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 708.950412] env[68617]: DEBUG oslo_concurrency.lockutils [req-0d514fff-5916-4ff9-b376-e88c374e656c req-1a1a4764-9389-4586-ba7a-4a6eb487600a service nova] Releasing lock "refresh_cache-5f4991a3-c34b-45b1-a3af-94d7d990eef1" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 708.950500] env[68617]: DEBUG nova.compute.manager [req-0d514fff-5916-4ff9-b376-e88c374e656c req-1a1a4764-9389-4586-ba7a-4a6eb487600a service nova] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Received event network-vif-plugged-6d992fe3-b0e7-4342-b724-e5fc24e07d7d {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 708.950662] env[68617]: DEBUG oslo_concurrency.lockutils [req-0d514fff-5916-4ff9-b376-e88c374e656c req-1a1a4764-9389-4586-ba7a-4a6eb487600a service nova] Acquiring lock "b95883b2-0366-4f52-bdf2-aa6259fafc58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 708.950851] env[68617]: DEBUG oslo_concurrency.lockutils [req-0d514fff-5916-4ff9-b376-e88c374e656c req-1a1a4764-9389-4586-ba7a-4a6eb487600a service nova] Lock "b95883b2-0366-4f52-bdf2-aa6259fafc58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 708.952165] env[68617]: DEBUG oslo_concurrency.lockutils [req-0d514fff-5916-4ff9-b376-e88c374e656c req-1a1a4764-9389-4586-ba7a-4a6eb487600a service nova] Lock "b95883b2-0366-4f52-bdf2-aa6259fafc58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 708.952165] env[68617]: DEBUG
nova.compute.manager [req-0d514fff-5916-4ff9-b376-e88c374e656c req-1a1a4764-9389-4586-ba7a-4a6eb487600a service nova] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] No waiting events found dispatching network-vif-plugged-6d992fe3-b0e7-4342-b724-e5fc24e07d7d {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 708.952165] env[68617]: WARNING nova.compute.manager [req-0d514fff-5916-4ff9-b376-e88c374e656c req-1a1a4764-9389-4586-ba7a-4a6eb487600a service nova] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Received unexpected event network-vif-plugged-6d992fe3-b0e7-4342-b724-e5fc24e07d7d for instance with vm_state building and task_state spawning. [ 708.952165] env[68617]: DEBUG nova.compute.manager [req-0d514fff-5916-4ff9-b376-e88c374e656c req-1a1a4764-9389-4586-ba7a-4a6eb487600a service nova] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Received event network-changed-6d992fe3-b0e7-4342-b724-e5fc24e07d7d {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 708.952718] env[68617]: DEBUG nova.compute.manager [req-0d514fff-5916-4ff9-b376-e88c374e656c req-1a1a4764-9389-4586-ba7a-4a6eb487600a service nova] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Refreshing instance network info cache due to event network-changed-6d992fe3-b0e7-4342-b724-e5fc24e07d7d. {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 708.952718] env[68617]: DEBUG oslo_concurrency.lockutils [req-0d514fff-5916-4ff9-b376-e88c374e656c req-1a1a4764-9389-4586-ba7a-4a6eb487600a service nova] Acquiring lock "refresh_cache-b95883b2-0366-4f52-bdf2-aa6259fafc58" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 708.952718] env[68617]: DEBUG oslo_concurrency.lockutils [req-0d514fff-5916-4ff9-b376-e88c374e656c req-1a1a4764-9389-4586-ba7a-4a6eb487600a service nova] Acquired lock "refresh_cache-b95883b2-0366-4f52-bdf2-aa6259fafc58" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 708.952718] env[68617]: DEBUG nova.network.neutron [req-0d514fff-5916-4ff9-b376-e88c374e656c req-1a1a4764-9389-4586-ba7a-4a6eb487600a service nova] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Refreshing network info cache for port 6d992fe3-b0e7-4342-b724-e5fc24e07d7d {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 708.997657] env[68617]: DEBUG nova.network.neutron [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Successfully created port: 106b3564-fd1a-4802-8825-cec0ac29771e {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 709.099958] env[68617]: DEBUG nova.network.neutron [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Successfully updated port: 3cce742a-a0fb-42cc-b3fa-ac15c3c48765 {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 709.115032] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Acquiring lock "refresh_cache-6300077d-5aa7-4794-8ba2-1ec30151c15c" {{(pid=68617) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 709.115131] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Acquired lock "refresh_cache-6300077d-5aa7-4794-8ba2-1ec30151c15c" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 709.115316] env[68617]: DEBUG nova.network.neutron [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 709.261142] env[68617]: DEBUG nova.network.neutron [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 710.066829] env[68617]: DEBUG nova.network.neutron [req-0d514fff-5916-4ff9-b376-e88c374e656c req-1a1a4764-9389-4586-ba7a-4a6eb487600a service nova] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Updated VIF entry in instance network info cache for port 6d992fe3-b0e7-4342-b724-e5fc24e07d7d. {{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 710.067206] env[68617]: DEBUG nova.network.neutron [req-0d514fff-5916-4ff9-b376-e88c374e656c req-1a1a4764-9389-4586-ba7a-4a6eb487600a service nova] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Updating instance_info_cache with network_info: [{"id": "6d992fe3-b0e7-4342-b724-e5fc24e07d7d", "address": "fa:16:3e:75:46:a0", "network": {"id": "baaa01e2-3cba-4bce-8ebe-04dd5ff2e4f0", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-360587695-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "30d430f92fc448d88707e4bcabd47d82", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dad4f433-bb0b-45c7-8040-972ef2277f75", "external-id": "nsx-vlan-transportzone-451", "segmentation_id": 451, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6d992fe3-b0", "ovs_interfaceid": "6d992fe3-b0e7-4342-b724-e5fc24e07d7d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 710.087971] env[68617]: DEBUG oslo_concurrency.lockutils [req-0d514fff-5916-4ff9-b376-e88c374e656c req-1a1a4764-9389-4586-ba7a-4a6eb487600a service nova] Releasing lock "refresh_cache-b95883b2-0366-4f52-bdf2-aa6259fafc58" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 710.319132] env[68617]: DEBUG nova.network.neutron [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 
tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Updating instance_info_cache with network_info: [{"id": "3cce742a-a0fb-42cc-b3fa-ac15c3c48765", "address": "fa:16:3e:8d:23:bd", "network": {"id": "e3aee9db-8596-4ea8-943e-5c365382ee22", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.239", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "f1a3ab6230dd468b8019424ce71de8ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cde23701-02ca-4cb4-b5a6-d321f8ac9660", "external-id": "nsx-vlan-transportzone-586", "segmentation_id": 586, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3cce742a-a0", "ovs_interfaceid": "3cce742a-a0fb-42cc-b3fa-ac15c3c48765", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 710.344244] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Releasing lock "refresh_cache-6300077d-5aa7-4794-8ba2-1ec30151c15c" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 710.344244] env[68617]: DEBUG nova.compute.manager [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Instance network_info: |[{"id": "3cce742a-a0fb-42cc-b3fa-ac15c3c48765", "address": "fa:16:3e:8d:23:bd", "network": {"id": "e3aee9db-8596-4ea8-943e-5c365382ee22", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.239", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "f1a3ab6230dd468b8019424ce71de8ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cde23701-02ca-4cb4-b5a6-d321f8ac9660", "external-id": "nsx-vlan-transportzone-586", "segmentation_id": 586, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3cce742a-a0", "ovs_interfaceid": "3cce742a-a0fb-42cc-b3fa-ac15c3c48765", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 710.344493] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] 
Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:8d:23:bd', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'cde23701-02ca-4cb4-b5a6-d321f8ac9660', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3cce742a-a0fb-42cc-b3fa-ac15c3c48765', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 710.359531] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Creating folder: Project (1fb6f6ef0b1f47f482b359aa265cb6a7). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 710.360575] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-456513f7-39c4-4940-b7d1-591239b36e12 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 710.364597] env[68617]: DEBUG nova.compute.manager [req-f5822f15-65af-46ed-a2ca-ce77956083f1 req-aa5cf6d2-cfa1-450f-ba9a-eb13f677dd7a service nova] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Received event network-vif-plugged-1d043def-bc38-40fd-85ba-95148f129598 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 710.365174] env[68617]: DEBUG oslo_concurrency.lockutils [req-f5822f15-65af-46ed-a2ca-ce77956083f1 req-aa5cf6d2-cfa1-450f-ba9a-eb13f677dd7a service nova] Acquiring lock "050e2b27-1311-4a9a-b5cf-6bc2f7128eba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 710.365174] env[68617]: DEBUG oslo_concurrency.lockutils [req-f5822f15-65af-46ed-a2ca-ce77956083f1 req-aa5cf6d2-cfa1-450f-ba9a-eb13f677dd7a service nova] Lock "050e2b27-1311-4a9a-b5cf-6bc2f7128eba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 710.365285] env[68617]: DEBUG oslo_concurrency.lockutils [req-f5822f15-65af-46ed-a2ca-ce77956083f1 req-aa5cf6d2-cfa1-450f-ba9a-eb13f677dd7a service nova] Lock "050e2b27-1311-4a9a-b5cf-6bc2f7128eba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 710.365463] env[68617]: DEBUG nova.compute.manager [req-f5822f15-65af-46ed-a2ca-ce77956083f1 req-aa5cf6d2-cfa1-450f-ba9a-eb13f677dd7a service nova] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] No waiting events found dispatching network-vif-plugged-1d043def-bc38-40fd-85ba-95148f129598 {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 710.365665] env[68617]: WARNING nova.compute.manager [req-f5822f15-65af-46ed-a2ca-ce77956083f1 req-aa5cf6d2-cfa1-450f-ba9a-eb13f677dd7a service nova] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Received unexpected event network-vif-plugged-1d043def-bc38-40fd-85ba-95148f129598 for instance with vm_state building and task_state spawning.
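
Annotation: the CreateVM_Task and SearchDatastore_Task entries in this stretch all follow the oslo.vmware pattern visible in the source tags (wait_for_task / _poll_task in oslo_vmware/api.py): the driver invokes a vCenter method that returns a task handle, then polls it ("progress is 0%", "completed successfully") until it reaches a terminal state. A minimal sketch of that polling loop, assuming a poll_task callable that reports the task state; the names and the 0.5 s interval are illustrative, not oslo.vmware's actual implementation:

import time

class TaskFailedError(Exception):
    """Raised when the polled task ends in an error state (illustrative)."""

def wait_for_task(poll_task, interval=0.5):
    # poll_task() is assumed to return a dict such as
    # {'state': 'running'|'success'|'error', 'progress': 0},
    # mirroring the "Task: {...} progress is 0%" entries above.
    while True:
        info = poll_task()
        if info['state'] == 'success':
            return info  # "completed successfully"
        if info['state'] == 'error':
            raise TaskFailedError(info.get('error', 'unknown error'))
        time.sleep(interval)  # oslo.service drives this via a looping call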
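Annotation: likewise, each Acquiring/Acquired/Releasing triple around "[datastore2] devstack-image-cache_base/c87eab51-..." reflects oslo.concurrency's named-lock pattern: the builder holds the image-cache lock while checking for the cached VMDK, so concurrent spawns of the same image queue behind a single fetch. A rough sketch of that serialization, using a plain threading lock per name in place of lockutils; the helper names are assumptions for illustration:

import threading
from collections import defaultdict

_named_locks = defaultdict(threading.Lock)  # one lock per cache path

def fetch_image_if_missing(cache_path, is_cached, fetch):
    # Serializes downloads the way the "devstack-image-cache_base/<image-id>"
    # lock entries above do: only one builder fetches; later builders find
    # the cached copy and return immediately.
    with _named_locks[cache_path]:   # "Acquiring lock" / "Acquired lock"
        if not is_cached(cache_path):
            fetch(cache_path)        # download into the datastore cache
        return cache_path            # lock released here: "Releasing lock"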
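Annotation: the recurring "No waiting events found dispatching network-vif-plugged-... / Received unexpected event ..." pairs are benign: Neutron delivers the vif-plugged notification before the driver has registered a waiter for it, so pop_instance_event finds nothing queued and the manager logs a warning while the spawn continues. A condensed sketch of that dispatch shape, under the assumption that waiters are keyed by (instance, event); this models the structure, not Nova's exact code:

import threading

_waiters = {}  # (instance_uuid, event_name) -> threading.Event

def prepare_for_event(instance_uuid, event_name):
    # Called by the spawn path before it blocks on the event.
    ev = threading.Event()
    _waiters[(instance_uuid, event_name)] = ev
    return ev

def dispatch_event(instance_uuid, event_name, log):
    ev = _waiters.pop((instance_uuid, event_name), None)
    if ev is None:
        # Nothing was waiting yet -> the "Received unexpected event ...
        # for instance with vm_state building" warnings above.
        log('WARNING: unexpected event %s for %s' % (event_name, instance_uuid))
        return
    ev.set()  # wake the thread blocked waiting for the VIF plug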
[ 710.377162] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Created folder: Project (1fb6f6ef0b1f47f482b359aa265cb6a7) in parent group-v693691. [ 710.377162] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Creating folder: Instances. Parent ref: group-v693716. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 710.377162] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c8a25644-802c-4c2c-9f81-e1906bebf2b4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 710.386996] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Created folder: Instances in parent group-v693716. [ 710.389148] env[68617]: DEBUG oslo.service.loopingcall [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 710.389148] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 710.389148] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5705726f-0d40-4a09-9a72-ecfe8e2d50d3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 710.418280] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 710.418280] env[68617]: value = "task-3470710" [ 710.418280] env[68617]: _type = "Task" [ 710.418280] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 710.428376] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470710, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 710.551930] env[68617]: DEBUG nova.compute.manager [req-c4a37b22-47f5-4009-876f-8c04fb15b65c req-92a21ed2-b6d5-4242-9855-f3885e408c39 service nova] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Received event network-vif-plugged-66c72d22-142a-4cfd-9b6f-4569f3c69765 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 710.552379] env[68617]: DEBUG oslo_concurrency.lockutils [req-c4a37b22-47f5-4009-876f-8c04fb15b65c req-92a21ed2-b6d5-4242-9855-f3885e408c39 service nova] Acquiring lock "f13242a0-7e65-4d68-a317-16fb8c4b8f8a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 710.552535] env[68617]: DEBUG oslo_concurrency.lockutils [req-c4a37b22-47f5-4009-876f-8c04fb15b65c req-92a21ed2-b6d5-4242-9855-f3885e408c39 service nova] Lock "f13242a0-7e65-4d68-a317-16fb8c4b8f8a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 710.552727] env[68617]: DEBUG oslo_concurrency.lockutils [req-c4a37b22-47f5-4009-876f-8c04fb15b65c req-92a21ed2-b6d5-4242-9855-f3885e408c39 service nova] Lock "f13242a0-7e65-4d68-a317-16fb8c4b8f8a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 710.553060] env[68617]: DEBUG nova.compute.manager [req-c4a37b22-47f5-4009-876f-8c04fb15b65c req-92a21ed2-b6d5-4242-9855-f3885e408c39 service nova] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] No waiting events found dispatching network-vif-plugged-66c72d22-142a-4cfd-9b6f-4569f3c69765 {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 710.553188] env[68617]: WARNING nova.compute.manager [req-c4a37b22-47f5-4009-876f-8c04fb15b65c req-92a21ed2-b6d5-4242-9855-f3885e408c39 service nova] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Received unexpected event network-vif-plugged-66c72d22-142a-4cfd-9b6f-4569f3c69765 for instance with vm_state building and task_state spawning. [ 710.605764] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Acquiring lock "71b1ebba-2019-4378-9bd2-98a7559c22e8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 710.606246] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Lock "71b1ebba-2019-4378-9bd2-98a7559c22e8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 710.935930] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470710, 'name': CreateVM_Task, 'duration_secs': 0.333277} completed successfully.
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 710.937269] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 710.938137] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 710.938137] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 710.938359] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 710.938611] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8feef3bd-1d83-45de-bcec-8540399e6bcd {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 710.945508] env[68617]: DEBUG oslo_vmware.api [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Waiting for the task: (returnval){ [ 710.945508] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52da40a8-74a0-cbcf-fe33-7dd525cdfa18" [ 710.945508] env[68617]: _type = "Task" [ 710.945508] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 710.954949] env[68617]: DEBUG oslo_vmware.api [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52da40a8-74a0-cbcf-fe33-7dd525cdfa18, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 711.459233] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 711.459233] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 711.459233] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 711.796316] env[68617]: DEBUG nova.network.neutron [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Successfully updated port: 106b3564-fd1a-4802-8825-cec0ac29771e {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 711.817766] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquiring lock "refresh_cache-4ea5887f-84bd-4629-b568-e73c78af0ad4" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 711.820931] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquired lock "refresh_cache-4ea5887f-84bd-4629-b568-e73c78af0ad4" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 711.821154] env[68617]: DEBUG nova.network.neutron [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 711.916008] env[68617]: DEBUG oslo_concurrency.lockutils [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Acquiring lock "e6b6cbdd-11d6-44a6-8da7-98e0f52cef67" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 711.916236] env[68617]: DEBUG oslo_concurrency.lockutils [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Lock
"e6b6cbdd-11d6-44a6-8da7-98e0f52cef67" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 711.959264] env[68617]: DEBUG nova.network.neutron [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 712.578540] env[68617]: DEBUG oslo_concurrency.lockutils [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquiring lock "b27ace75-e2fa-4acc-96cb-88dd49b89de5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 712.579346] env[68617]: DEBUG oslo_concurrency.lockutils [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Lock "b27ace75-e2fa-4acc-96cb-88dd49b89de5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 712.625352] env[68617]: DEBUG nova.network.neutron [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Updating instance_info_cache with network_info: [{"id": "106b3564-fd1a-4802-8825-cec0ac29771e", "address": "fa:16:3e:d3:1f:a0", "network": {"id": "0f2e6893-43e2-458a-8326-dd03f1a6b1a7", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2034507765-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "828be21ced7d4d11a462ae49d04280ba", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "49b5df12-d801-4140-8816-2fd401608c7d", "external-id": "nsx-vlan-transportzone-326", "segmentation_id": 326, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap106b3564-fd", "ovs_interfaceid": "106b3564-fd1a-4802-8825-cec0ac29771e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 712.643111] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Releasing lock "refresh_cache-4ea5887f-84bd-4629-b568-e73c78af0ad4" {{(pid=68617) lock
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 712.643111] env[68617]: DEBUG nova.compute.manager [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Instance network_info: |[{"id": "106b3564-fd1a-4802-8825-cec0ac29771e", "address": "fa:16:3e:d3:1f:a0", "network": {"id": "0f2e6893-43e2-458a-8326-dd03f1a6b1a7", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2034507765-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "828be21ced7d4d11a462ae49d04280ba", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "49b5df12-d801-4140-8816-2fd401608c7d", "external-id": "nsx-vlan-transportzone-326", "segmentation_id": 326, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap106b3564-fd", "ovs_interfaceid": "106b3564-fd1a-4802-8825-cec0ac29771e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 712.643466] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d3:1f:a0', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '49b5df12-d801-4140-8816-2fd401608c7d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '106b3564-fd1a-4802-8825-cec0ac29771e', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 712.651309] env[68617]: DEBUG oslo.service.loopingcall [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 712.652361] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 712.652598] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7af71a95-4dea-49c8-a244-56b47cbd639d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.674063] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 712.674063] env[68617]: value = "task-3470711" [ 712.674063] env[68617]: _type = "Task" [ 712.674063] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 712.682457] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470711, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 713.185105] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470711, 'name': CreateVM_Task, 'duration_secs': 0.330337} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 713.185337] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 713.186829] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 713.186829] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 713.187482] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 713.187482] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0695274a-b2c4-4ebb-b451-27cfa10b9fe5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.192752] env[68617]: DEBUG oslo_vmware.api [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Waiting for the task: (returnval){ [ 713.192752] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52a39bdd-457d-a92d-db2f-22dc21799167" [ 713.192752] env[68617]: _type = "Task" [ 713.192752] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 713.204201] env[68617]: DEBUG oslo_vmware.api [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52a39bdd-457d-a92d-db2f-22dc21799167, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 713.709655] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 713.709655] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 713.709655] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 713.734151] env[68617]: DEBUG nova.compute.manager [req-b90207b4-c37e-4c35-8e37-1a89b3204e86 req-5162e7f7-107f-4158-a931-3d4a2b9d2f39 service nova] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Received event network-changed-1d043def-bc38-40fd-85ba-95148f129598 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 713.734480] env[68617]: DEBUG nova.compute.manager [req-b90207b4-c37e-4c35-8e37-1a89b3204e86 req-5162e7f7-107f-4158-a931-3d4a2b9d2f39 service nova] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Refreshing instance network info cache due to event network-changed-1d043def-bc38-40fd-85ba-95148f129598. 
{{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 713.734620] env[68617]: DEBUG oslo_concurrency.lockutils [req-b90207b4-c37e-4c35-8e37-1a89b3204e86 req-5162e7f7-107f-4158-a931-3d4a2b9d2f39 service nova] Acquiring lock "refresh_cache-050e2b27-1311-4a9a-b5cf-6bc2f7128eba" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 713.734756] env[68617]: DEBUG oslo_concurrency.lockutils [req-b90207b4-c37e-4c35-8e37-1a89b3204e86 req-5162e7f7-107f-4158-a931-3d4a2b9d2f39 service nova] Acquired lock "refresh_cache-050e2b27-1311-4a9a-b5cf-6bc2f7128eba" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 713.734914] env[68617]: DEBUG nova.network.neutron [req-b90207b4-c37e-4c35-8e37-1a89b3204e86 req-5162e7f7-107f-4158-a931-3d4a2b9d2f39 service nova] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Refreshing network info cache for port 1d043def-bc38-40fd-85ba-95148f129598 {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 714.032618] env[68617]: DEBUG oslo_concurrency.lockutils [None req-8673ebc2-02bf-4b52-87c4-05d73ef56ad6 tempest-VolumesAdminNegativeTest-561724217 tempest-VolumesAdminNegativeTest-561724217-project-member] Acquiring lock "152f9e1d-dd1b-486f-94b8-8202c0f2d335" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 714.032872] env[68617]: DEBUG oslo_concurrency.lockutils [None req-8673ebc2-02bf-4b52-87c4-05d73ef56ad6 tempest-VolumesAdminNegativeTest-561724217 tempest-VolumesAdminNegativeTest-561724217-project-member] Lock "152f9e1d-dd1b-486f-94b8-8202c0f2d335" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 714.819404] env[68617]: DEBUG nova.network.neutron [req-b90207b4-c37e-4c35-8e37-1a89b3204e86 req-5162e7f7-107f-4158-a931-3d4a2b9d2f39 service nova] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Updated VIF entry in instance network info cache for port 1d043def-bc38-40fd-85ba-95148f129598.
{{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 714.819404] env[68617]: DEBUG nova.network.neutron [req-b90207b4-c37e-4c35-8e37-1a89b3204e86 req-5162e7f7-107f-4158-a931-3d4a2b9d2f39 service nova] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Updating instance_info_cache with network_info: [{"id": "1d043def-bc38-40fd-85ba-95148f129598", "address": "fa:16:3e:61:7b:9d", "network": {"id": "e3aee9db-8596-4ea8-943e-5c365382ee22", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.89", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "f1a3ab6230dd468b8019424ce71de8ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cde23701-02ca-4cb4-b5a6-d321f8ac9660", "external-id": "nsx-vlan-transportzone-586", "segmentation_id": 586, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1d043def-bc", "ovs_interfaceid": "1d043def-bc38-40fd-85ba-95148f129598", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 714.830525] env[68617]: DEBUG oslo_concurrency.lockutils [req-b90207b4-c37e-4c35-8e37-1a89b3204e86 req-5162e7f7-107f-4158-a931-3d4a2b9d2f39 service nova] Releasing lock "refresh_cache-050e2b27-1311-4a9a-b5cf-6bc2f7128eba" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 714.922228] env[68617]: DEBUG nova.compute.manager [req-40cd6224-7348-4993-85ae-07e61be9c21b req-0d8a60b5-c343-418c-98a9-ab1a8f39655e service nova] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Received event network-vif-plugged-3cce742a-a0fb-42cc-b3fa-ac15c3c48765 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 714.922441] env[68617]: DEBUG oslo_concurrency.lockutils [req-40cd6224-7348-4993-85ae-07e61be9c21b req-0d8a60b5-c343-418c-98a9-ab1a8f39655e service nova] Acquiring lock "6300077d-5aa7-4794-8ba2-1ec30151c15c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 714.922644] env[68617]: DEBUG oslo_concurrency.lockutils [req-40cd6224-7348-4993-85ae-07e61be9c21b req-0d8a60b5-c343-418c-98a9-ab1a8f39655e service nova] Lock "6300077d-5aa7-4794-8ba2-1ec30151c15c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 714.922807] env[68617]: DEBUG oslo_concurrency.lockutils [req-40cd6224-7348-4993-85ae-07e61be9c21b req-0d8a60b5-c343-418c-98a9-ab1a8f39655e service nova] Lock "6300077d-5aa7-4794-8ba2-1ec30151c15c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 714.922972] env[68617]: DEBUG nova.compute.manager
[req-40cd6224-7348-4993-85ae-07e61be9c21b req-0d8a60b5-c343-418c-98a9-ab1a8f39655e service nova] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] No waiting events found dispatching network-vif-plugged-3cce742a-a0fb-42cc-b3fa-ac15c3c48765 {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 714.923569] env[68617]: WARNING nova.compute.manager [req-40cd6224-7348-4993-85ae-07e61be9c21b req-0d8a60b5-c343-418c-98a9-ab1a8f39655e service nova] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Received unexpected event network-vif-plugged-3cce742a-a0fb-42cc-b3fa-ac15c3c48765 for instance with vm_state building and task_state spawning. [ 714.953632] env[68617]: DEBUG nova.compute.manager [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Received event network-changed-66c72d22-142a-4cfd-9b6f-4569f3c69765 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 714.953879] env[68617]: DEBUG nova.compute.manager [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Refreshing instance network info cache due to event network-changed-66c72d22-142a-4cfd-9b6f-4569f3c69765. {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 714.954149] env[68617]: DEBUG oslo_concurrency.lockutils [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] Acquiring lock "refresh_cache-f13242a0-7e65-4d68-a317-16fb8c4b8f8a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 714.954333] env[68617]: DEBUG oslo_concurrency.lockutils [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] Acquired lock "refresh_cache-f13242a0-7e65-4d68-a317-16fb8c4b8f8a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 714.955093] env[68617]: DEBUG nova.network.neutron [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Refreshing network info cache for port 66c72d22-142a-4cfd-9b6f-4569f3c69765 {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 714.957744] env[68617]: DEBUG oslo_concurrency.lockutils [None req-cc7683d3-a14d-40a7-9a16-b134b3aec2f0 tempest-ServerShowV254Test-70051388 tempest-ServerShowV254Test-70051388-project-member] Acquiring lock "9d10a63c-4c97-48c3-aca8-fd317aa2fbe7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 714.957993] env[68617]: DEBUG oslo_concurrency.lockutils [None req-cc7683d3-a14d-40a7-9a16-b134b3aec2f0 tempest-ServerShowV254Test-70051388 tempest-ServerShowV254Test-70051388-project-member] Lock "9d10a63c-4c97-48c3-aca8-fd317aa2fbe7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 715.737348] env[68617]: DEBUG nova.network.neutron [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Updated VIF
entry in instance network info cache for port 66c72d22-142a-4cfd-9b6f-4569f3c69765. {{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 715.737348] env[68617]: DEBUG nova.network.neutron [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Updating instance_info_cache with network_info: [{"id": "66c72d22-142a-4cfd-9b6f-4569f3c69765", "address": "fa:16:3e:05:1e:b9", "network": {"id": "0f2e6893-43e2-458a-8326-dd03f1a6b1a7", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2034507765-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "828be21ced7d4d11a462ae49d04280ba", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "49b5df12-d801-4140-8816-2fd401608c7d", "external-id": "nsx-vlan-transportzone-326", "segmentation_id": 326, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap66c72d22-14", "ovs_interfaceid": "66c72d22-142a-4cfd-9b6f-4569f3c69765", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 715.754705] env[68617]: DEBUG oslo_concurrency.lockutils [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] Releasing lock "refresh_cache-f13242a0-7e65-4d68-a317-16fb8c4b8f8a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 715.755501] env[68617]: DEBUG nova.compute.manager [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Received event network-changed-3cce742a-a0fb-42cc-b3fa-ac15c3c48765 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 715.755501] env[68617]: DEBUG nova.compute.manager [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Refreshing instance network info cache due to event network-changed-3cce742a-a0fb-42cc-b3fa-ac15c3c48765. 
{{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 715.755501] env[68617]: DEBUG oslo_concurrency.lockutils [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] Acquiring lock "refresh_cache-6300077d-5aa7-4794-8ba2-1ec30151c15c" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 715.755501] env[68617]: DEBUG oslo_concurrency.lockutils [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] Acquired lock "refresh_cache-6300077d-5aa7-4794-8ba2-1ec30151c15c" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 715.755701] env[68617]: DEBUG nova.network.neutron [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Refreshing network info cache for port 3cce742a-a0fb-42cc-b3fa-ac15c3c48765 {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 716.657143] env[68617]: DEBUG nova.network.neutron [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Updated VIF entry in instance network info cache for port 3cce742a-a0fb-42cc-b3fa-ac15c3c48765. {{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 716.657490] env[68617]: DEBUG nova.network.neutron [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Updating instance_info_cache with network_info: [{"id": "3cce742a-a0fb-42cc-b3fa-ac15c3c48765", "address": "fa:16:3e:8d:23:bd", "network": {"id": "e3aee9db-8596-4ea8-943e-5c365382ee22", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.239", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "f1a3ab6230dd468b8019424ce71de8ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cde23701-02ca-4cb4-b5a6-d321f8ac9660", "external-id": "nsx-vlan-transportzone-586", "segmentation_id": 586, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3cce742a-a0", "ovs_interfaceid": "3cce742a-a0fb-42cc-b3fa-ac15c3c48765", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 716.671489] env[68617]: DEBUG oslo_concurrency.lockutils [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] Releasing lock "refresh_cache-6300077d-5aa7-4794-8ba2-1ec30151c15c" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 716.671489] env[68617]: DEBUG nova.compute.manager [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Received event 
network-vif-plugged-106b3564-fd1a-4802-8825-cec0ac29771e {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 716.671901] env[68617]: DEBUG oslo_concurrency.lockutils [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] Acquiring lock "4ea5887f-84bd-4629-b568-e73c78af0ad4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 716.672198] env[68617]: DEBUG oslo_concurrency.lockutils [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] Lock "4ea5887f-84bd-4629-b568-e73c78af0ad4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 716.672397] env[68617]: DEBUG oslo_concurrency.lockutils [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] Lock "4ea5887f-84bd-4629-b568-e73c78af0ad4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 716.672577] env[68617]: DEBUG nova.compute.manager [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] No waiting events found dispatching network-vif-plugged-106b3564-fd1a-4802-8825-cec0ac29771e {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 716.672750] env[68617]: WARNING nova.compute.manager [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Received unexpected event network-vif-plugged-106b3564-fd1a-4802-8825-cec0ac29771e for instance with vm_state building and task_state spawning. [ 716.672917] env[68617]: DEBUG nova.compute.manager [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Received event network-changed-106b3564-fd1a-4802-8825-cec0ac29771e {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 716.673109] env[68617]: DEBUG nova.compute.manager [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Refreshing instance network info cache due to event network-changed-106b3564-fd1a-4802-8825-cec0ac29771e.
{{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 716.673313] env[68617]: DEBUG oslo_concurrency.lockutils [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] Acquiring lock "refresh_cache-4ea5887f-84bd-4629-b568-e73c78af0ad4" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 716.673450] env[68617]: DEBUG oslo_concurrency.lockutils [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] Acquired lock "refresh_cache-4ea5887f-84bd-4629-b568-e73c78af0ad4" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 716.673606] env[68617]: DEBUG nova.network.neutron [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Refreshing network info cache for port 106b3564-fd1a-4802-8825-cec0ac29771e {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 717.413227] env[68617]: DEBUG nova.network.neutron [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Updated VIF entry in instance network info cache for port 106b3564-fd1a-4802-8825-cec0ac29771e. {{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 717.413227] env[68617]: DEBUG nova.network.neutron [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Updating instance_info_cache with network_info: [{"id": "106b3564-fd1a-4802-8825-cec0ac29771e", "address": "fa:16:3e:d3:1f:a0", "network": {"id": "0f2e6893-43e2-458a-8326-dd03f1a6b1a7", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2034507765-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "828be21ced7d4d11a462ae49d04280ba", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "49b5df12-d801-4140-8816-2fd401608c7d", "external-id": "nsx-vlan-transportzone-326", "segmentation_id": 326, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap106b3564-fd", "ovs_interfaceid": "106b3564-fd1a-4802-8825-cec0ac29771e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 717.434034] env[68617]: DEBUG oslo_concurrency.lockutils [req-eae454d4-baf6-4717-8899-b16be7da1161 req-b14078de-ada2-4459-904c-4b47b4eb7b62 service nova] Releasing lock "refresh_cache-4ea5887f-84bd-4629-b568-e73c78af0ad4" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 721.580084] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5f4b9570-f02c-433a-af31-90737f90adf7 tempest-ImagesNegativeTestJSON-780312143 tempest-ImagesNegativeTestJSON-780312143-project-member] Acquiring 
lock "e6e6c910-9485-48b0-bffa-4534cd7f87d4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 721.580351] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5f4b9570-f02c-433a-af31-90737f90adf7 tempest-ImagesNegativeTestJSON-780312143 tempest-ImagesNegativeTestJSON-780312143-project-member] Lock "e6e6c910-9485-48b0-bffa-4534cd7f87d4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 722.874275] env[68617]: DEBUG oslo_concurrency.lockutils [None req-87039a0c-e740-428e-b486-a4fc387bd6d0 tempest-ServersTestMultiNic-884689889 tempest-ServersTestMultiNic-884689889-project-member] Acquiring lock "1ec954d1-1bc9-4db3-9a48-7da759cebf21" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 722.874615] env[68617]: DEBUG oslo_concurrency.lockutils [None req-87039a0c-e740-428e-b486-a4fc387bd6d0 tempest-ServersTestMultiNic-884689889 tempest-ServersTestMultiNic-884689889-project-member] Lock "1ec954d1-1bc9-4db3-9a48-7da759cebf21" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 723.415013] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ac06e194-790a-4b22-a986-6f23dcf296da tempest-VolumesAssistedSnapshotsTest-1080420425 tempest-VolumesAssistedSnapshotsTest-1080420425-project-member] Acquiring lock "7d51d3c0-12fd-4118-80c6-16c1cca346db" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 723.415492] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ac06e194-790a-4b22-a986-6f23dcf296da tempest-VolumesAssistedSnapshotsTest-1080420425 tempest-VolumesAssistedSnapshotsTest-1080420425-project-member] Lock "7d51d3c0-12fd-4118-80c6-16c1cca346db" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 725.589537] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5d634d33-51a2-4f85-8593-a8501573f884 tempest-SecurityGroupsTestJSON-1069621129 tempest-SecurityGroupsTestJSON-1069621129-project-member] Acquiring lock "a43cf82a-c969-47eb-b8dc-d7fe7f7870d3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 725.590212] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5d634d33-51a2-4f85-8593-a8501573f884 tempest-SecurityGroupsTestJSON-1069621129 tempest-SecurityGroupsTestJSON-1069621129-project-member] Lock "a43cf82a-c969-47eb-b8dc-d7fe7f7870d3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 728.619580] env[68617]: DEBUG oslo_concurrency.lockutils [None req-94dbeb25-06a4-414d-8622-f98526281db2 tempest-ServersWithSpecificFlavorTestJSON-843794936 tempest-ServersWithSpecificFlavorTestJSON-843794936-project-member] Acquiring lock "40c6521b-51d9-45cf-959c-21e4f3da7eb9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 728.619580] env[68617]: DEBUG oslo_concurrency.lockutils [None req-94dbeb25-06a4-414d-8622-f98526281db2 tempest-ServersWithSpecificFlavorTestJSON-843794936 tempest-ServersWithSpecificFlavorTestJSON-843794936-project-member] Lock "40c6521b-51d9-45cf-959c-21e4f3da7eb9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 730.997552] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ec57737b-ea16-4b83-9bb4-b9d43e9aef52 tempest-MigrationsAdminTest-1112293401 tempest-MigrationsAdminTest-1112293401-project-member] Acquiring lock "dae068af-0c54-4715-bdc3-ecfd018b6294" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 730.997860] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ec57737b-ea16-4b83-9bb4-b9d43e9aef52 tempest-MigrationsAdminTest-1112293401 tempest-MigrationsAdminTest-1112293401-project-member] Lock "dae068af-0c54-4715-bdc3-ecfd018b6294" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 731.445399] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ae66108a-45b4-4490-93f1-08beff341940 tempest-InstanceActionsNegativeTestJSON-1283463967 tempest-InstanceActionsNegativeTestJSON-1283463967-project-member] Acquiring lock "ee6e18cd-9af2-4440-8336-9e1858c28709" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 731.445641] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ae66108a-45b4-4490-93f1-08beff341940 tempest-InstanceActionsNegativeTestJSON-1283463967 tempest-InstanceActionsNegativeTestJSON-1283463967-project-member] Lock "ee6e18cd-9af2-4440-8336-9e1858c28709" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 738.766590] env[68617]: WARNING oslo_vmware.rw_handles [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 738.766590] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 738.766590] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in
close [ 738.766590] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 738.766590] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 738.766590] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 738.766590] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 738.766590] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 738.766590] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 738.766590] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 738.766590] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 738.766590] env[68617]: ERROR oslo_vmware.rw_handles [ 738.767193] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/135b78cd-b10a-4932-b77a-56939c307f08/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 738.770034] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 738.770682] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Copying Virtual Disk [datastore2] vmware_temp/135b78cd-b10a-4932-b77a-56939c307f08/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/135b78cd-b10a-4932-b77a-56939c307f08/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 738.770797] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-94df1e84-7391-46d6-b798-8c7f46f4c6b6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 738.780401] env[68617]: DEBUG oslo_vmware.api [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Waiting for the task: (returnval){ [ 738.780401] env[68617]: value = "task-3470712" [ 738.780401] env[68617]: _type = "Task" [ 738.780401] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 738.791550] env[68617]: DEBUG oslo_vmware.api [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Task: {'id': task-3470712, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 739.292767] env[68617]: DEBUG oslo_vmware.exceptions [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Fault InvalidArgument not matched. {{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 739.293125] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 739.296516] env[68617]: ERROR nova.compute.manager [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 739.296516] env[68617]: Faults: ['InvalidArgument'] [ 739.296516] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Traceback (most recent call last): [ 739.296516] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 739.296516] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] yield resources [ 739.296516] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 739.296516] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] self.driver.spawn(context, instance, image_meta, [ 739.296516] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 739.296516] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 739.296516] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 739.296516] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] self._fetch_image_if_missing(context, vi) [ 739.296516] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 739.297278] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] image_cache(vi, tmp_image_ds_loc) [ 739.297278] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 739.297278] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] vm_util.copy_virtual_disk( [ 739.297278] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in 
copy_virtual_disk [ 739.297278] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] session._wait_for_task(vmdk_copy_task) [ 739.297278] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 739.297278] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] return self.wait_for_task(task_ref) [ 739.297278] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 739.297278] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] return evt.wait() [ 739.297278] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 739.297278] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] result = hub.switch() [ 739.297278] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 739.297278] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] return self.greenlet.switch() [ 739.297655] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 739.297655] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] self.f(*self.args, **self.kw) [ 739.297655] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 739.297655] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] raise exceptions.translate_fault(task_info.error) [ 739.297655] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 739.297655] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Faults: ['InvalidArgument'] [ 739.297655] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] [ 739.297655] env[68617]: INFO nova.compute.manager [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Terminating instance [ 739.298492] env[68617]: DEBUG oslo_concurrency.lockutils [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 739.298761] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Creating directory with path [datastore2] devstack-image-cache_base 
{{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 739.299901] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Acquiring lock "refresh_cache-26f6016f-5fb5-4fd2-9ee3-648297d969b3" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 739.300080] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Acquired lock "refresh_cache-26f6016f-5fb5-4fd2-9ee3-648297d969b3" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 739.300271] env[68617]: DEBUG nova.network.neutron [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 739.301840] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-253e2441-46ea-4127-a830-1bdd189f9fc1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 739.310338] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 739.310544] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 739.311887] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-97c808d9-9050-41df-b0a0-1008367376be {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 739.321146] env[68617]: DEBUG oslo_vmware.api [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Waiting for the task: (returnval){ [ 739.321146] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]5268b3a3-9a75-6268-6560-6ba0b5d8ffca" [ 739.321146] env[68617]: _type = "Task" [ 739.321146] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 739.329258] env[68617]: DEBUG oslo_vmware.api [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]5268b3a3-9a75-6268-6560-6ba0b5d8ffca, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 739.350992] env[68617]: DEBUG nova.network.neutron [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 739.734374] env[68617]: DEBUG nova.network.neutron [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 739.747165] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Releasing lock "refresh_cache-26f6016f-5fb5-4fd2-9ee3-648297d969b3" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 739.748237] env[68617]: DEBUG nova.compute.manager [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 739.748447] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 739.749580] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9dce4b6b-2be3-492a-9549-0ab413fb1069 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 739.759809] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 739.760160] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6a2906a1-2a2e-4792-9f71-40feaa1abce4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 739.799837] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 739.800093] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 739.800221] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Deleting the datastore file [datastore2] 26f6016f-5fb5-4fd2-9ee3-648297d969b3 {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 739.800533] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f8200f63-435e-4de7-b5a3-d93be09deefa {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 739.810601] env[68617]: DEBUG oslo_vmware.api [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Waiting for the task: (returnval){ [ 739.810601] env[68617]: value = "task-3470714" [ 739.810601] env[68617]: _type = "Task" [ 739.810601] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 739.819587] env[68617]: DEBUG oslo_vmware.api [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Task: {'id': task-3470714, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 739.832371] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 739.832877] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Creating directory with path [datastore2] vmware_temp/4380e46a-eb34-435c-a0de-2e6d79e694e7/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 739.832877] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-888a47ee-47ad-4c9c-951b-cce322e4df88 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 739.851498] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Created directory with path [datastore2] vmware_temp/4380e46a-eb34-435c-a0de-2e6d79e694e7/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 739.853033] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Fetch image to [datastore2] vmware_temp/4380e46a-eb34-435c-a0de-2e6d79e694e7/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 739.853033] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 
tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/4380e46a-eb34-435c-a0de-2e6d79e694e7/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 739.853033] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52244471-1fc6-49aa-b7f9-3c31f35b8433 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 739.860381] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e232e91-4332-4eab-9efe-d8ab35d328fd {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 739.869460] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa7385db-152f-4f66-b188-644039268790 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 739.908331] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78a9c9df-71bb-469a-9218-e6a919cf076f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 739.916721] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-01d5a36d-1948-4b8a-a56a-817779723be8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 739.946660] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 740.018030] env[68617]: DEBUG oslo_vmware.rw_handles [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4380e46a-eb34-435c-a0de-2e6d79e694e7/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 740.082461] env[68617]: DEBUG oslo_vmware.rw_handles [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 740.082675] env[68617]: DEBUG oslo_vmware.rw_handles [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4380e46a-eb34-435c-a0de-2e6d79e694e7/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 740.320667] env[68617]: DEBUG oslo_vmware.api [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Task: {'id': task-3470714, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.033945} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 740.321379] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 740.324144] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 740.324144] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 740.324144] env[68617]: INFO nova.compute.manager [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Took 0.57 seconds to destroy the instance on the hypervisor. [ 740.324144] env[68617]: DEBUG oslo.service.loopingcall [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 740.324144] env[68617]: DEBUG nova.compute.manager [-] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Skipping network deallocation for instance since networking was not requested.
{{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 740.325389] env[68617]: DEBUG nova.compute.claims [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 740.325700] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 740.326153] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 740.725575] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad661020-b372-45c2-a977-46a3add4ae89 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 740.735611] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b67845b-27e3-4e35-9bdf-6a99369a00f2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 740.773019] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0efdf544-1b2e-440a-84cd-6238c8b42a2e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 740.778603] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6a5a436-0e84-4f7b-a4b0-5b43bf0366d9 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 740.795901] env[68617]: DEBUG nova.compute.provider_tree [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 740.809509] env[68617]: DEBUG nova.scheduler.client.report [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 740.842032] env[68617]: DEBUG oslo_concurrency.lockutils [None 
req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.514s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 740.842032] env[68617]: ERROR nova.compute.manager [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 740.842032] env[68617]: Faults: ['InvalidArgument'] [ 740.842032] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Traceback (most recent call last): [ 740.842032] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 740.842032] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] self.driver.spawn(context, instance, image_meta, [ 740.842032] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 740.842032] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 740.842032] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 740.842032] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] self._fetch_image_if_missing(context, vi) [ 740.842398] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 740.842398] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] image_cache(vi, tmp_image_ds_loc) [ 740.842398] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 740.842398] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] vm_util.copy_virtual_disk( [ 740.842398] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 740.842398] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] session._wait_for_task(vmdk_copy_task) [ 740.842398] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 740.842398] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] return self.wait_for_task(task_ref) [ 740.842398] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 740.842398] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] return evt.wait() [ 
740.842398] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 740.842398] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] result = hub.switch() [ 740.842398] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 740.843105] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] return self.greenlet.switch() [ 740.843105] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 740.843105] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] self.f(*self.args, **self.kw) [ 740.843105] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 740.843105] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] raise exceptions.translate_fault(task_info.error) [ 740.843105] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 740.843105] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Faults: ['InvalidArgument'] [ 740.843105] env[68617]: ERROR nova.compute.manager [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] [ 740.843105] env[68617]: DEBUG nova.compute.utils [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 740.846394] env[68617]: DEBUG nova.compute.manager [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Build of instance 26f6016f-5fb5-4fd2-9ee3-648297d969b3 was re-scheduled: A specified parameter was not correct: fileType [ 740.846394] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 740.846723] env[68617]: DEBUG nova.compute.manager [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 740.846948] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Acquiring lock "refresh_cache-26f6016f-5fb5-4fd2-9ee3-648297d969b3" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 740.847249] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 
tempest-ServersAdmin275Test-1319113762-project-member] Acquired lock "refresh_cache-26f6016f-5fb5-4fd2-9ee3-648297d969b3" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 740.847249] env[68617]: DEBUG nova.network.neutron [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 740.879800] env[68617]: DEBUG nova.network.neutron [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 740.977729] env[68617]: DEBUG nova.network.neutron [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 740.990393] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Releasing lock "refresh_cache-26f6016f-5fb5-4fd2-9ee3-648297d969b3" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 740.990612] env[68617]: DEBUG nova.compute.manager [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 740.990789] env[68617]: DEBUG nova.compute.manager [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] [instance: 26f6016f-5fb5-4fd2-9ee3-648297d969b3] Skipping network deallocation for instance since networking was not requested. {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 741.117847] env[68617]: INFO nova.scheduler.client.report [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Deleted allocations for instance 26f6016f-5fb5-4fd2-9ee3-648297d969b3 [ 741.147630] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5dcfdb7d-fe22-4b30-8cce-98a942fb4459 tempest-ServersAdmin275Test-1319113762 tempest-ServersAdmin275Test-1319113762-project-member] Lock "26f6016f-5fb5-4fd2-9ee3-648297d969b3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 51.541s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 741.183621] env[68617]: DEBUG nova.compute.manager [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Starting instance...
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 741.262051] env[68617]: DEBUG oslo_concurrency.lockutils [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 741.262051] env[68617]: DEBUG oslo_concurrency.lockutils [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 741.262051] env[68617]: INFO nova.compute.claims [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 741.708149] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-477421f1-2341-45c4-ade2-64ff90f98b1d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 741.717975] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bcd0432-71c6-41db-9cd4-181e8feeb800 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 741.754918] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73b26d9d-8e2a-4f77-a549-b7cdc31c0167 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 741.764672] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14cc304b-3159-412b-bb29-6d713a535f06 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 741.782527] env[68617]: DEBUG nova.compute.provider_tree [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 741.795135] env[68617]: DEBUG nova.scheduler.client.report [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 741.819306] env[68617]: DEBUG oslo_concurrency.lockutils [None 
req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.559s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 741.819952] env[68617]: DEBUG nova.compute.manager [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 741.873904] env[68617]: DEBUG nova.compute.utils [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 741.875435] env[68617]: DEBUG nova.compute.manager [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 741.878828] env[68617]: DEBUG nova.network.neutron [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 741.890736] env[68617]: DEBUG nova.compute.manager [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 741.974018] env[68617]: DEBUG nova.policy [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'be1fb3906fa449949fc0b5eae9cab9fb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1e11c4e5c25a42119594647403c0199b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 741.990557] env[68617]: DEBUG nova.compute.manager [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 742.025536] env[68617]: DEBUG nova.virt.hardware [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=<?>,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-17T05:31:26Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 742.025536] env[68617]: DEBUG nova.virt.hardware [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 742.025536] env[68617]: DEBUG nova.virt.hardware [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 742.025963] env[68617]: DEBUG nova.virt.hardware [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 742.025963] env[68617]: DEBUG nova.virt.hardware [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 742.025963] env[68617]: DEBUG nova.virt.hardware [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 742.025963] env[68617]: DEBUG nova.virt.hardware [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 742.025963] env[68617]: DEBUG nova.virt.hardware [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 742.026121] env[68617]: DEBUG
nova.virt.hardware [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 742.026121] env[68617]: DEBUG nova.virt.hardware [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 742.026121] env[68617]: DEBUG nova.virt.hardware [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 742.026121] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55af4f5d-6787-438a-aa30-2b62919c57ab {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 742.036903] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60ef9f04-d0be-49b3-9a94-975f6c916d1b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 742.366051] env[68617]: DEBUG nova.network.neutron [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Successfully created port: af50c01d-82ed-46bb-9f73-44cda11ecaa1 {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 743.072094] env[68617]: DEBUG nova.network.neutron [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Successfully updated port: af50c01d-82ed-46bb-9f73-44cda11ecaa1 {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 743.085312] env[68617]: DEBUG oslo_concurrency.lockutils [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "refresh_cache-6eef6e24-cf49-458b-ae37-8da4e02045f8" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 743.085312] env[68617]: DEBUG oslo_concurrency.lockutils [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquired lock "refresh_cache-6eef6e24-cf49-458b-ae37-8da4e02045f8" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 743.085425] env[68617]: DEBUG nova.network.neutron [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 743.157435] env[68617]: DEBUG nova.network.neutron [None 
req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 743.534019] env[68617]: DEBUG nova.network.neutron [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Updating instance_info_cache with network_info: [{"id": "af50c01d-82ed-46bb-9f73-44cda11ecaa1", "address": "fa:16:3e:94:49:a2", "network": {"id": "1d9c32bb-1c81-4af6-8d3f-365a52df11cd", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-313904480-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1e11c4e5c25a42119594647403c0199b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d62c1cf-f39a-4626-9552-f1e13c692636", "external-id": "nsx-vlan-transportzone-748", "segmentation_id": 748, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapaf50c01d-82", "ovs_interfaceid": "af50c01d-82ed-46bb-9f73-44cda11ecaa1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 743.549194] env[68617]: DEBUG oslo_concurrency.lockutils [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Releasing lock "refresh_cache-6eef6e24-cf49-458b-ae37-8da4e02045f8" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 743.549487] env[68617]: DEBUG nova.compute.manager [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Instance network_info: |[{"id": "af50c01d-82ed-46bb-9f73-44cda11ecaa1", "address": "fa:16:3e:94:49:a2", "network": {"id": "1d9c32bb-1c81-4af6-8d3f-365a52df11cd", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-313904480-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1e11c4e5c25a42119594647403c0199b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d62c1cf-f39a-4626-9552-f1e13c692636", "external-id": "nsx-vlan-transportzone-748", "segmentation_id": 748, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapaf50c01d-82", "ovs_interfaceid": 
"af50c01d-82ed-46bb-9f73-44cda11ecaa1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 743.549878] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:94:49:a2', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6d62c1cf-f39a-4626-9552-f1e13c692636', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'af50c01d-82ed-46bb-9f73-44cda11ecaa1', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 743.559088] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Creating folder: Project (1e11c4e5c25a42119594647403c0199b). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 743.560229] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e8ffbd40-0ec3-4798-81b5-9c725cab0c9e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 743.572637] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Created folder: Project (1e11c4e5c25a42119594647403c0199b) in parent group-v693691. [ 743.573231] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Creating folder: Instances. Parent ref: group-v693720. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 743.573568] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a83fa947-6331-4119-a285-7c80b4433253 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 743.583487] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Created folder: Instances in parent group-v693720. [ 743.583755] env[68617]: DEBUG oslo.service.loopingcall [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 743.583942] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 743.584197] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a35febbc-2c49-4271-9364-95b707c609fd {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 743.608232] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 743.608232] env[68617]: value = "task-3470717" [ 743.608232] env[68617]: _type = "Task" [ 743.608232] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 743.622535] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470717, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 744.119887] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470717, 'name': CreateVM_Task, 'duration_secs': 0.307348} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 744.120760] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 744.120802] env[68617]: DEBUG oslo_concurrency.lockutils [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 744.120942] env[68617]: DEBUG oslo_concurrency.lockutils [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 744.121272] env[68617]: DEBUG oslo_concurrency.lockutils [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 744.122328] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bb739dd1-9b72-4d59-bad4-ba32c4ecc408 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 744.127230] env[68617]: DEBUG oslo_vmware.api [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Waiting for the task: (returnval){ [ 744.127230] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]529c6c55-fd35-b26b-7825-af5f3ce2c827" [ 744.127230] env[68617]: _type = "Task" [ 744.127230] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 744.135857] env[68617]: DEBUG oslo_vmware.api [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]529c6c55-fd35-b26b-7825-af5f3ce2c827, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 744.638175] env[68617]: DEBUG oslo_concurrency.lockutils [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 744.638787] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 744.638787] env[68617]: DEBUG oslo_concurrency.lockutils [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 744.779048] env[68617]: DEBUG nova.compute.manager [req-cf9e2d34-d5e3-453b-a5a8-7763fb9ddd00 req-46f9a482-85f8-41aa-9a1e-40161431ae28 service nova] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Received event network-vif-plugged-af50c01d-82ed-46bb-9f73-44cda11ecaa1 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 744.779269] env[68617]: DEBUG oslo_concurrency.lockutils [req-cf9e2d34-d5e3-453b-a5a8-7763fb9ddd00 req-46f9a482-85f8-41aa-9a1e-40161431ae28 service nova] Acquiring lock "6eef6e24-cf49-458b-ae37-8da4e02045f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 744.779474] env[68617]: DEBUG oslo_concurrency.lockutils [req-cf9e2d34-d5e3-453b-a5a8-7763fb9ddd00 req-46f9a482-85f8-41aa-9a1e-40161431ae28 service nova] Lock "6eef6e24-cf49-458b-ae37-8da4e02045f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 744.779670] env[68617]: DEBUG oslo_concurrency.lockutils [req-cf9e2d34-d5e3-453b-a5a8-7763fb9ddd00 req-46f9a482-85f8-41aa-9a1e-40161431ae28 service nova] Lock "6eef6e24-cf49-458b-ae37-8da4e02045f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 744.780116] env[68617]: DEBUG nova.compute.manager [req-cf9e2d34-d5e3-453b-a5a8-7763fb9ddd00 req-46f9a482-85f8-41aa-9a1e-40161431ae28 service nova] [instance:
6eef6e24-cf49-458b-ae37-8da4e02045f8] No waiting events found dispatching network-vif-plugged-af50c01d-82ed-46bb-9f73-44cda11ecaa1 {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 744.780116] env[68617]: WARNING nova.compute.manager [req-cf9e2d34-d5e3-453b-a5a8-7763fb9ddd00 req-46f9a482-85f8-41aa-9a1e-40161431ae28 service nova] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Received unexpected event network-vif-plugged-af50c01d-82ed-46bb-9f73-44cda11ecaa1 for instance with vm_state building and task_state spawning. [ 748.313432] env[68617]: DEBUG oslo_concurrency.lockutils [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "995585f5-57a4-4ba6-9e28-18a086af264c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 748.314146] env[68617]: DEBUG oslo_concurrency.lockutils [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "995585f5-57a4-4ba6-9e28-18a086af264c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 748.854179] env[68617]: DEBUG nova.compute.manager [req-81818987-6bcb-4dd5-b704-88057aefe2fc req-096d64ca-e182-4aa5-bea1-20d5c9961e9b service nova] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Received event network-changed-af50c01d-82ed-46bb-9f73-44cda11ecaa1 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 748.854179] env[68617]: DEBUG nova.compute.manager [req-81818987-6bcb-4dd5-b704-88057aefe2fc req-096d64ca-e182-4aa5-bea1-20d5c9961e9b service nova] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Refreshing instance network info cache due to event network-changed-af50c01d-82ed-46bb-9f73-44cda11ecaa1.
{{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 748.854179] env[68617]: DEBUG oslo_concurrency.lockutils [req-81818987-6bcb-4dd5-b704-88057aefe2fc req-096d64ca-e182-4aa5-bea1-20d5c9961e9b service nova] Acquiring lock "refresh_cache-6eef6e24-cf49-458b-ae37-8da4e02045f8" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 748.854179] env[68617]: DEBUG oslo_concurrency.lockutils [req-81818987-6bcb-4dd5-b704-88057aefe2fc req-096d64ca-e182-4aa5-bea1-20d5c9961e9b service nova] Acquired lock "refresh_cache-6eef6e24-cf49-458b-ae37-8da4e02045f8" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 748.856822] env[68617]: DEBUG nova.network.neutron [req-81818987-6bcb-4dd5-b704-88057aefe2fc req-096d64ca-e182-4aa5-bea1-20d5c9961e9b service nova] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Refreshing network info cache for port af50c01d-82ed-46bb-9f73-44cda11ecaa1 {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 748.910374] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f3064b87-2ca0-4b92-b4a8-1120c3d2f60d tempest-AttachInterfacesTestJSON-753337404 tempest-AttachInterfacesTestJSON-753337404-project-member] Acquiring lock "53abf4e7-35ea-415b-8a90-a89442c475a1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 748.910652] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f3064b87-2ca0-4b92-b4a8-1120c3d2f60d tempest-AttachInterfacesTestJSON-753337404 tempest-AttachInterfacesTestJSON-753337404-project-member] Lock "53abf4e7-35ea-415b-8a90-a89442c475a1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 749.704162] env[68617]: DEBUG nova.network.neutron [req-81818987-6bcb-4dd5-b704-88057aefe2fc req-096d64ca-e182-4aa5-bea1-20d5c9961e9b service nova] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Updated VIF entry in instance network info cache for port af50c01d-82ed-46bb-9f73-44cda11ecaa1.
{{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 749.704537] env[68617]: DEBUG nova.network.neutron [req-81818987-6bcb-4dd5-b704-88057aefe2fc req-096d64ca-e182-4aa5-bea1-20d5c9961e9b service nova] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Updating instance_info_cache with network_info: [{"id": "af50c01d-82ed-46bb-9f73-44cda11ecaa1", "address": "fa:16:3e:94:49:a2", "network": {"id": "1d9c32bb-1c81-4af6-8d3f-365a52df11cd", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-313904480-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1e11c4e5c25a42119594647403c0199b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d62c1cf-f39a-4626-9552-f1e13c692636", "external-id": "nsx-vlan-transportzone-748", "segmentation_id": 748, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapaf50c01d-82", "ovs_interfaceid": "af50c01d-82ed-46bb-9f73-44cda11ecaa1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 749.717564] env[68617]: DEBUG oslo_concurrency.lockutils [req-81818987-6bcb-4dd5-b704-88057aefe2fc req-096d64ca-e182-4aa5-bea1-20d5c9961e9b service nova] Releasing lock "refresh_cache-6eef6e24-cf49-458b-ae37-8da4e02045f8" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 750.411033] env[68617]: DEBUG oslo_concurrency.lockutils [None req-86503bab-b95f-46f2-ae42-9dadcd71b7ba tempest-ServerAddressesTestJSON-1323985713 tempest-ServerAddressesTestJSON-1323985713-project-member] Acquiring lock "8d0a643a-96c5-4d47-aa2a-9f777e80c259" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 750.411675] env[68617]: DEBUG oslo_concurrency.lockutils [None req-86503bab-b95f-46f2-ae42-9dadcd71b7ba tempest-ServerAddressesTestJSON-1323985713 tempest-ServerAddressesTestJSON-1323985713-project-member] Lock "8d0a643a-96c5-4d47-aa2a-9f777e80c259" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 751.214355] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c22f9f6b-d3b5-4e1b-a997-e31bb568d743 tempest-ServersTestJSON-1804119009 tempest-ServersTestJSON-1804119009-project-member] Acquiring lock "6341f8b6-9f42-4ab2-806f-dbad62de5376" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 751.214355] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c22f9f6b-d3b5-4e1b-a997-e31bb568d743 tempest-ServersTestJSON-1804119009 tempest-ServersTestJSON-1804119009-project-member] Lock
"6341f8b6-9f42-4ab2-806f-dbad62de5376" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 751.426113] env[68617]: DEBUG oslo_concurrency.lockutils [None req-be884208-027a-4bdd-8f4b-0274fa0ae816 tempest-TenantUsagesTestJSON-2135587497 tempest-TenantUsagesTestJSON-2135587497-project-member] Acquiring lock "cc32959e-71ea-44cb-aebe-bf6a893ebb18" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 751.426425] env[68617]: DEBUG oslo_concurrency.lockutils [None req-be884208-027a-4bdd-8f4b-0274fa0ae816 tempest-TenantUsagesTestJSON-2135587497 tempest-TenantUsagesTestJSON-2135587497-project-member] Lock "cc32959e-71ea-44cb-aebe-bf6a893ebb18" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 753.817783] env[68617]: DEBUG oslo_concurrency.lockutils [None req-df3c1748-33bf-4848-8e70-a516d8f991e9 tempest-ImagesOneServerNegativeTestJSON-249197895 tempest-ImagesOneServerNegativeTestJSON-249197895-project-member] Acquiring lock "704e4d83-19ef-493a-a374-dce0de95e975" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 753.817783] env[68617]: DEBUG oslo_concurrency.lockutils [None req-df3c1748-33bf-4848-8e70-a516d8f991e9 tempest-ImagesOneServerNegativeTestJSON-249197895 tempest-ImagesOneServerNegativeTestJSON-249197895-project-member] Lock "704e4d83-19ef-493a-a374-dce0de95e975" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 759.287452] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9e1cc523-2db5-40d8-9590-d32bffc942bf tempest-ImagesTestJSON-918330909 tempest-ImagesTestJSON-918330909-project-member] Acquiring lock "df9a6dc4-abb5-4855-ac4f-5479dd0b6498" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 759.287769] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9e1cc523-2db5-40d8-9590-d32bffc942bf tempest-ImagesTestJSON-918330909 tempest-ImagesTestJSON-918330909-project-member] Lock "df9a6dc4-abb5-4855-ac4f-5479dd0b6498" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 759.968678] env[68617]: DEBUG oslo_concurrency.lockutils [None req-de31d373-1ac6-424b-8651-58ff44a68371 tempest-ServersAaction247Test-609120817 tempest-ServersAaction247Test-609120817-project-member] Acquiring lock "f6f64438-8279-4ff4-ab80-efd1e17d7e04" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 759.968678] env[68617]: DEBUG oslo_concurrency.lockutils [None req-de31d373-1ac6-424b-8651-58ff44a68371 tempest-ServersAaction247Test-609120817 tempest-ServersAaction247Test-609120817-project-member] Lock "f6f64438-8279-4ff4-ab80-efd1e17d7e04" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 761.557159] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 761.557489] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 761.588024] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 761.588024] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 761.588024] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 761.613745] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 761.613852] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 761.614033] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 761.614198] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 761.614356] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Skipping network cache update for instance because it is Building.
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 761.614480] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 761.614597] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 761.614711] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 761.614825] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 761.614941] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 761.615071] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 761.615578] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 761.615744] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 761.616144] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 761.698961] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 761.699147] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 762.698786] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 762.699049] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 762.699196] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 762.711976] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 762.712298] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 762.712480] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 762.712722] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 762.714013] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3389dd62-47a3-4c52-a522-b9746b58f468 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.724034] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7028b825-944f-400d-805a-e1696ba482f9 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.740959] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c38f02d6-061d-42cd-bbc8-4dbefc63f4b2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.748886] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-511c1ba4-fdbc-48f5-bbfc-0375a8b31581 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.779803] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] 
Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180925MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 762.779956] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 762.780123] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 762.856864] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b5707ff5-916e-49ce-9aac-9a08ac51bdf2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 762.856864] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance c507115c-92a0-4513-aae8-7dc8f95bc0ea actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 762.856864] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 5f4991a3-c34b-45b1-a3af-94d7d990eef1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 762.856864] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b95883b2-0366-4f52-bdf2-aa6259fafc58 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 762.857110] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 050e2b27-1311-4a9a-b5cf-6bc2f7128eba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 762.857110] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f13242a0-7e65-4d68-a317-16fb8c4b8f8a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 762.857110] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 3b95678b-dfc5-4610-a51e-2ae12fbe274b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 762.857110] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6300077d-5aa7-4794-8ba2-1ec30151c15c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 762.857263] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 4ea5887f-84bd-4629-b568-e73c78af0ad4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 762.857263] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6eef6e24-cf49-458b-ae37-8da4e02045f8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 762.881899] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 71b1ebba-2019-4378-9bd2-98a7559c22e8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 762.906983] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e6b6cbdd-11d6-44a6-8da7-98e0f52cef67 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 762.918144] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b27ace75-e2fa-4acc-96cb-88dd49b89de5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 762.928912] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 152f9e1d-dd1b-486f-94b8-8202c0f2d335 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 762.938869] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 9d10a63c-4c97-48c3-aca8-fd317aa2fbe7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 762.949052] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e6e6c910-9485-48b0-bffa-4534cd7f87d4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 762.958559] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 1ec954d1-1bc9-4db3-9a48-7da759cebf21 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 762.970339] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 7d51d3c0-12fd-4118-80c6-16c1cca346db has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 762.981142] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a43cf82a-c969-47eb-b8dc-d7fe7f7870d3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 762.991387] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 40c6521b-51d9-45cf-959c-21e4f3da7eb9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 763.002482] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance dae068af-0c54-4715-bdc3-ecfd018b6294 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 763.012252] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance ee6e18cd-9af2-4440-8336-9e1858c28709 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 763.038960] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 995585f5-57a4-4ba6-9e28-18a086af264c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 763.050737] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 53abf4e7-35ea-415b-8a90-a89442c475a1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 763.061494] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 8d0a643a-96c5-4d47-aa2a-9f777e80c259 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 763.072152] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6341f8b6-9f42-4ab2-806f-dbad62de5376 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 763.082159] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance cc32959e-71ea-44cb-aebe-bf6a893ebb18 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 763.092762] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 704e4d83-19ef-493a-a374-dce0de95e975 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 763.104136] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance df9a6dc4-abb5-4855-ac4f-5479dd0b6498 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 763.115448] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f6f64438-8279-4ff4-ab80-efd1e17d7e04 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 763.115448] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 763.115448] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 763.483654] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a2ff9ba-2d11-40b4-981d-e0b6a6c9ccbd {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.492062] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37a55551-ac7d-4fce-bc36-c1049610111b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.522855] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39add64c-74c4-4143-a103-1fff2399d637 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.531867] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bc28652-a8c6-4dda-aeee-65f5e9f982e2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.547160] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 763.557771] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 
'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 763.573229] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 763.573430] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.793s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 770.882483] env[68617]: DEBUG oslo_concurrency.lockutils [None req-27c732df-2043-469d-9521-c021487abf4b tempest-InstanceActionsTestJSON-641152923 tempest-InstanceActionsTestJSON-641152923-project-member] Acquiring lock "f2bef6cc-f5e2-41a8-b377-31f016746257" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 770.882850] env[68617]: DEBUG oslo_concurrency.lockutils [None req-27c732df-2043-469d-9521-c021487abf4b tempest-InstanceActionsTestJSON-641152923 tempest-InstanceActionsTestJSON-641152923-project-member] Lock "f2bef6cc-f5e2-41a8-b377-31f016746257" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 776.169057] env[68617]: DEBUG oslo_concurrency.lockutils [None req-645ccf85-ce17-473f-8068-5b0c95db8022 tempest-ServersTestBootFromVolume-1306830610 tempest-ServersTestBootFromVolume-1306830610-project-member] Acquiring lock "fe74e8d8-e439-4834-9721-08d9e64c7740" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 776.169370] env[68617]: DEBUG oslo_concurrency.lockutils [None req-645ccf85-ce17-473f-8068-5b0c95db8022 tempest-ServersTestBootFromVolume-1306830610 tempest-ServersTestBootFromVolume-1306830610-project-member] Lock "fe74e8d8-e439-4834-9721-08d9e64c7740" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 787.159994] env[68617]: WARNING oslo_vmware.rw_handles [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 787.159994] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 787.159994] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 787.159994] env[68617]: ERROR oslo_vmware.rw_handles 
self._conn.getresponse() [ 787.159994] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 787.159994] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 787.159994] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 787.159994] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 787.159994] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 787.159994] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 787.159994] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 787.159994] env[68617]: ERROR oslo_vmware.rw_handles [ 787.159994] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/4380e46a-eb34-435c-a0de-2e6d79e694e7/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 787.160710] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 787.160710] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Copying Virtual Disk [datastore2] vmware_temp/4380e46a-eb34-435c-a0de-2e6d79e694e7/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/4380e46a-eb34-435c-a0de-2e6d79e694e7/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 787.160710] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9409ce01-37c1-4d92-b41f-4ead27d9f359 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 787.171097] env[68617]: DEBUG oslo_vmware.api [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Waiting for the task: (returnval){ [ 787.171097] env[68617]: value = "task-3470729" [ 787.171097] env[68617]: _type = "Task" [ 787.171097] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 787.179928] env[68617]: DEBUG oslo_vmware.api [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Task: {'id': task-3470729, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 787.681779] env[68617]: DEBUG oslo_vmware.exceptions [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Fault InvalidArgument not matched. {{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 787.682085] env[68617]: DEBUG oslo_concurrency.lockutils [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 787.682694] env[68617]: ERROR nova.compute.manager [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 787.682694] env[68617]: Faults: ['InvalidArgument'] [ 787.682694] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Traceback (most recent call last): [ 787.682694] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 787.682694] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] yield resources [ 787.682694] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 787.682694] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] self.driver.spawn(context, instance, image_meta, [ 787.682694] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 787.682694] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 787.682694] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 787.682694] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] self._fetch_image_if_missing(context, vi) [ 787.682694] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 787.683075] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] image_cache(vi, tmp_image_ds_loc) [ 787.683075] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 787.683075] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] vm_util.copy_virtual_disk( [ 787.683075] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in 
copy_virtual_disk [ 787.683075] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] session._wait_for_task(vmdk_copy_task) [ 787.683075] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 787.683075] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] return self.wait_for_task(task_ref) [ 787.683075] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 787.683075] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] return evt.wait() [ 787.683075] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 787.683075] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] result = hub.switch() [ 787.683075] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 787.683075] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] return self.greenlet.switch() [ 787.683329] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 787.683329] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] self.f(*self.args, **self.kw) [ 787.683329] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 787.683329] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] raise exceptions.translate_fault(task_info.error) [ 787.683329] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 787.683329] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Faults: ['InvalidArgument'] [ 787.683329] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] [ 787.683329] env[68617]: INFO nova.compute.manager [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Terminating instance [ 787.684802] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 787.685041] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Creating directory with path 
[datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 787.685704] env[68617]: DEBUG nova.compute.manager [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 787.685889] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 787.686130] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7a0747f6-5022-453c-b9e5-fedd4cd8feaa {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 787.688815] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efaf63c7-2de4-44c4-a63e-b57d1105e297 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 787.696411] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 787.697046] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-07b534d3-3158-4f23-9656-ce270ae80632 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 787.699083] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 787.699269] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 787.700279] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d471a87e-b8e4-4fc6-8377-021b8195b020 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 787.706078] env[68617]: DEBUG oslo_vmware.api [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Waiting for the task: (returnval){ [ 787.706078] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52691a32-4e3b-2a44-7881-fa33762c394c" [ 787.706078] env[68617]: _type = "Task" [ 787.706078] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 787.713789] env[68617]: DEBUG oslo_vmware.api [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52691a32-4e3b-2a44-7881-fa33762c394c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 787.782670] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 787.782881] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 787.783071] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Deleting the datastore file [datastore2] b5707ff5-916e-49ce-9aac-9a08ac51bdf2 {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 787.783355] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-59ebf219-f5bb-459e-bb9d-7378e9b923a4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 787.790578] env[68617]: DEBUG oslo_vmware.api [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Waiting for the task: (returnval){ [ 787.790578] env[68617]: value = "task-3470731" [ 787.790578] env[68617]: _type = "Task" [ 787.790578] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 787.799739] env[68617]: DEBUG oslo_vmware.api [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Task: {'id': task-3470731, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 788.216771] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 788.217166] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Creating directory with path [datastore2] vmware_temp/38d65540-7909-4f90-8230-5035415edb8b/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 788.217238] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-523cd75d-f881-46b7-8975-b1bda27291ed {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 788.229258] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Created directory with path [datastore2] vmware_temp/38d65540-7909-4f90-8230-5035415edb8b/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 788.229451] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Fetch image to [datastore2] vmware_temp/38d65540-7909-4f90-8230-5035415edb8b/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 788.229619] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/38d65540-7909-4f90-8230-5035415edb8b/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 788.230359] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c05afed-f517-4464-a289-2aca13386fbe {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 788.237387] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e0268e8-2227-4376-b74c-932dcf15b068 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 788.247956] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6ccfa3e-0b38-4eba-b7e0-76e040e83271 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 788.278249] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-98bf419c-06a8-44b8-8e14-e185a19a44fe {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 788.284367] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-57c956e8-40ea-4d57-b3d6-93d81ac20eb7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 788.299526] env[68617]: DEBUG oslo_vmware.api [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Task: {'id': task-3470731, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072894} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 788.299774] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 788.299989] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 788.300218] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 788.300399] env[68617]: INFO nova.compute.manager [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Took 0.61 seconds to destroy the instance on the hypervisor. 
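The task records above (CopyVirtualDisk_Task failing, then DeleteDatastoreFile_Task completing successfully in 0.072894s) all follow the same wait_for_task pattern: poll the task state, log progress, and translate a vCenter fault into a VimFaultException when the task errors out. The earlier "Final resource view" record is also internally consistent with this trace: used_ram 1792 MB is the 512 MB reserved plus 10 x 128 MB for the ten allocated one-vCPU instances, and used_disk 10 GB is 10 x 1 GB. Below is a minimal sketch of that poll-and-translate loop; it is a simplified illustration, not the actual oslo_vmware.api implementation, and `get_task_info` is a hypothetical callable standing in for the PropertyCollector read that the real _poll_task performs.

```python
# Minimal sketch of the poll-and-translate loop behind the wait_for_task /
# _poll_task lines above. Hypothetical helper: get_task_info(task_ref) returns
# an object with .state, .progress, .error_msg and .fault_list; the real code
# reads these via the vSphere PropertyCollector (oslo_vmware/api.py).
import time


class VimFaultError(Exception):
    """Illustrative stand-in for oslo_vmware.exceptions.VimFaultException."""

    def __init__(self, message, fault_list):
        super().__init__(message)
        self.fault_list = fault_list


def wait_for_task(get_task_info, task_ref, poll_interval=0.5):
    """Block until the vCenter task succeeds; raise on task error."""
    while True:
        info = get_task_info(task_ref)
        if info.state in ('queued', 'running'):
            # Produces lines like: Task: {'id': task-3470729, ...} progress is 0%.
            print("Task %s progress is %s%%." % (task_ref, info.progress or 0))
            time.sleep(poll_interval)
        elif info.state == 'success':
            return info
        else:
            # state == 'error': surface the vCenter fault, mirroring
            # "raise exceptions.translate_fault(task_info.error)" in the trace.
            raise VimFaultError(info.error_msg, info.fault_list)
```

A CopyVirtualDisk_Task that ends in state 'error' with the InvalidArgument fault therefore surfaces as the VimFaultException seen in the spawn traceback, which _build_and_run_instance then converts into a reschedule.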
[ 788.302521] env[68617]: DEBUG nova.compute.claims [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 788.302702] env[68617]: DEBUG oslo_concurrency.lockutils [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 788.302920] env[68617]: DEBUG oslo_concurrency.lockutils [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 788.307311] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 788.374784] env[68617]: DEBUG oslo_vmware.rw_handles [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/38d65540-7909-4f90-8230-5035415edb8b/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 788.437208] env[68617]: DEBUG oslo_vmware.rw_handles [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 788.437399] env[68617]: DEBUG oslo_vmware.rw_handles [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/38d65540-7909-4f90-8230-5035415edb8b/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 788.797214] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ab9073f-757b-4af6-8d9d-b7214451ebb4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 788.806130] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42150226-3807-4d61-84e4-abd3dd2de0b5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 788.838016] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da613b9f-315e-4fc6-a431-5a39bb46b796 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 788.845811] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3cace38d-ed2a-45d3-b284-6e572267a883 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 788.862426] env[68617]: DEBUG nova.compute.provider_tree [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 788.871839] env[68617]: DEBUG nova.scheduler.client.report [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 788.887906] env[68617]: DEBUG oslo_concurrency.lockutils [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.585s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 788.888457] env[68617]: ERROR nova.compute.manager [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 788.888457] env[68617]: Faults: ['InvalidArgument'] [ 788.888457] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Traceback (most recent call last): [ 788.888457] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 788.888457] env[68617]: ERROR nova.compute.manager [instance: 
b5707ff5-916e-49ce-9aac-9a08ac51bdf2] self.driver.spawn(context, instance, image_meta, [ 788.888457] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 788.888457] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 788.888457] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 788.888457] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] self._fetch_image_if_missing(context, vi) [ 788.888457] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 788.888457] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] image_cache(vi, tmp_image_ds_loc) [ 788.888457] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 788.888806] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] vm_util.copy_virtual_disk( [ 788.888806] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 788.888806] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] session._wait_for_task(vmdk_copy_task) [ 788.888806] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 788.888806] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] return self.wait_for_task(task_ref) [ 788.888806] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 788.888806] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] return evt.wait() [ 788.888806] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 788.888806] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] result = hub.switch() [ 788.888806] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 788.888806] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] return self.greenlet.switch() [ 788.888806] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 788.888806] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] self.f(*self.args, **self.kw) [ 788.889064] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 
448, in _poll_task [ 788.889064] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] raise exceptions.translate_fault(task_info.error) [ 788.889064] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 788.889064] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Faults: ['InvalidArgument'] [ 788.889064] env[68617]: ERROR nova.compute.manager [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] [ 788.889167] env[68617]: DEBUG nova.compute.utils [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 788.893835] env[68617]: DEBUG nova.compute.manager [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Build of instance b5707ff5-916e-49ce-9aac-9a08ac51bdf2 was re-scheduled: A specified parameter was not correct: fileType [ 788.893835] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 788.894173] env[68617]: DEBUG nova.compute.manager [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 788.894348] env[68617]: DEBUG nova.compute.manager [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 788.894516] env[68617]: DEBUG nova.compute.manager [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 788.894680] env[68617]: DEBUG nova.network.neutron [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 789.272816] env[68617]: DEBUG nova.network.neutron [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 789.283332] env[68617]: INFO nova.compute.manager [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] Took 0.39 seconds to deallocate network for instance. [ 789.405879] env[68617]: INFO nova.scheduler.client.report [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Deleted allocations for instance b5707ff5-916e-49ce-9aac-9a08ac51bdf2 [ 789.429032] env[68617]: DEBUG oslo_concurrency.lockutils [None req-66175842-a1fc-456f-864c-ceb774abf015 tempest-ServerTagsTestJSON-1559950575 tempest-ServerTagsTestJSON-1559950575-project-member] Lock "b5707ff5-916e-49ce-9aac-9a08ac51bdf2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 101.681s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 789.430061] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "b5707ff5-916e-49ce-9aac-9a08ac51bdf2" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 100.781s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 789.430258] env[68617]: INFO nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: b5707ff5-916e-49ce-9aac-9a08ac51bdf2] During sync_power_state the instance has a pending task (spawning). Skip. [ 789.430430] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "b5707ff5-916e-49ce-9aac-9a08ac51bdf2" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 789.441208] env[68617]: DEBUG nova.compute.manager [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Starting instance... 
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 789.488935] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 789.489256] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 789.490942] env[68617]: INFO nova.compute.claims [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 789.905108] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bc6fb1a-aaf7-4203-8e32-7666b1cc0ece {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 789.913482] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9ccf551-8336-4fb4-b4f5-1fc5192272e9 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 789.948488] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45c639f4-d811-4f30-a90f-07665a6c8470 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 789.956538] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4dfda6eb-cca8-4b25-94ae-95767c08d1f4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 789.969996] env[68617]: DEBUG nova.compute.provider_tree [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 789.978234] env[68617]: DEBUG nova.scheduler.client.report [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 
789.992529] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.503s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 789.993030] env[68617]: DEBUG nova.compute.manager [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 790.030193] env[68617]: DEBUG nova.compute.utils [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 790.031529] env[68617]: DEBUG nova.compute.manager [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 790.031644] env[68617]: DEBUG nova.network.neutron [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 790.042986] env[68617]: DEBUG nova.compute.manager [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 790.101994] env[68617]: DEBUG nova.policy [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2f88c468845b4b91b928bcad4ddcabd4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ec839b3b8dc74ab4aaf0c9fff8794afe', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 790.111810] env[68617]: DEBUG nova.compute.manager [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 790.138139] env[68617]: DEBUG nova.virt.hardware [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 790.138389] env[68617]: DEBUG nova.virt.hardware [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 790.138544] env[68617]: DEBUG nova.virt.hardware [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 790.138725] env[68617]: DEBUG nova.virt.hardware [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 790.138870] env[68617]: DEBUG nova.virt.hardware [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 790.139115] env[68617]: DEBUG nova.virt.hardware [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 790.139368] env[68617]: DEBUG nova.virt.hardware [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 790.139529] env[68617]: DEBUG nova.virt.hardware [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 790.139699] env[68617]: DEBUG nova.virt.hardware [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 790.139863] env[68617]: DEBUG nova.virt.hardware [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 790.140057] env[68617]: DEBUG nova.virt.hardware [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 790.140905] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-096e0694-9177-4e08-ac51-e61ec2dbc2e4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 790.149661] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f374d3a-afa9-4702-8137-04c585a1e362 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 790.422056] env[68617]: DEBUG nova.network.neutron [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Successfully created port: feff7c41-aa8f-4653-bf76-f8bcd42bea65 {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 790.994909] env[68617]: DEBUG nova.network.neutron [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Successfully updated port: feff7c41-aa8f-4653-bf76-f8bcd42bea65 {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 791.008357] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Acquiring lock "refresh_cache-71b1ebba-2019-4378-9bd2-98a7559c22e8" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 791.008583] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Acquired lock "refresh_cache-71b1ebba-2019-4378-9bd2-98a7559c22e8" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 791.008801] env[68617]: DEBUG nova.network.neutron [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Building network 
info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 791.056119] env[68617]: DEBUG nova.network.neutron [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 791.215332] env[68617]: DEBUG nova.network.neutron [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Updating instance_info_cache with network_info: [{"id": "feff7c41-aa8f-4653-bf76-f8bcd42bea65", "address": "fa:16:3e:cb:b9:d8", "network": {"id": "11f566b0-645d-4983-bcf9-21fd2838775d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1026017301-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ec839b3b8dc74ab4aaf0c9fff8794afe", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "74f30339-6421-4654-bddb-81d7f34db9d7", "external-id": "nsx-vlan-transportzone-899", "segmentation_id": 899, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfeff7c41-aa", "ovs_interfaceid": "feff7c41-aa8f-4653-bf76-f8bcd42bea65", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 791.227240] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Releasing lock "refresh_cache-71b1ebba-2019-4378-9bd2-98a7559c22e8" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 791.227546] env[68617]: DEBUG nova.compute.manager [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Instance network_info: |[{"id": "feff7c41-aa8f-4653-bf76-f8bcd42bea65", "address": "fa:16:3e:cb:b9:d8", "network": {"id": "11f566b0-645d-4983-bcf9-21fd2838775d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1026017301-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ec839b3b8dc74ab4aaf0c9fff8794afe", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, 
"nsx-logical-switch-id": "74f30339-6421-4654-bddb-81d7f34db9d7", "external-id": "nsx-vlan-transportzone-899", "segmentation_id": 899, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfeff7c41-aa", "ovs_interfaceid": "feff7c41-aa8f-4653-bf76-f8bcd42bea65", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 791.227925] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:cb:b9:d8', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '74f30339-6421-4654-bddb-81d7f34db9d7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'feff7c41-aa8f-4653-bf76-f8bcd42bea65', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 791.235951] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Creating folder: Project (ec839b3b8dc74ab4aaf0c9fff8794afe). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 791.236536] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5766b339-31e2-4a04-a706-10015c9e8ec1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 791.247649] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Created folder: Project (ec839b3b8dc74ab4aaf0c9fff8794afe) in parent group-v693691. [ 791.247888] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Creating folder: Instances. Parent ref: group-v693727. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 791.248146] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-45c2db70-4157-4876-a608-16c0c3b94616 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 791.257801] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Created folder: Instances in parent group-v693727. [ 791.258118] env[68617]: DEBUG oslo.service.loopingcall [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 791.258345] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 791.258573] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6a411e63-8ff8-4976-8afa-e7d5fe0d8c83 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 791.280910] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 791.280910] env[68617]: value = "task-3470734" [ 791.280910] env[68617]: _type = "Task" [ 791.280910] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 791.295661] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470734, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 791.664029] env[68617]: DEBUG nova.compute.manager [req-2ddace6a-f5f4-470f-bc11-e5445a912d73 req-8d531d10-d92f-4583-aa86-86ad91ae8456 service nova] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Received event network-vif-plugged-feff7c41-aa8f-4653-bf76-f8bcd42bea65 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 791.664544] env[68617]: DEBUG oslo_concurrency.lockutils [req-2ddace6a-f5f4-470f-bc11-e5445a912d73 req-8d531d10-d92f-4583-aa86-86ad91ae8456 service nova] Acquiring lock "71b1ebba-2019-4378-9bd2-98a7559c22e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 791.664767] env[68617]: DEBUG oslo_concurrency.lockutils [req-2ddace6a-f5f4-470f-bc11-e5445a912d73 req-8d531d10-d92f-4583-aa86-86ad91ae8456 service nova] Lock "71b1ebba-2019-4378-9bd2-98a7559c22e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 791.664938] env[68617]: DEBUG oslo_concurrency.lockutils [req-2ddace6a-f5f4-470f-bc11-e5445a912d73 req-8d531d10-d92f-4583-aa86-86ad91ae8456 service nova] Lock "71b1ebba-2019-4378-9bd2-98a7559c22e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 791.666487] env[68617]: DEBUG nova.compute.manager [req-2ddace6a-f5f4-470f-bc11-e5445a912d73 req-8d531d10-d92f-4583-aa86-86ad91ae8456 service nova] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] No waiting events found dispatching network-vif-plugged-feff7c41-aa8f-4653-bf76-f8bcd42bea65 {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 791.666715] env[68617]: WARNING nova.compute.manager [req-2ddace6a-f5f4-470f-bc11-e5445a912d73 req-8d531d10-d92f-4583-aa86-86ad91ae8456 service nova] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Received unexpected event network-vif-plugged-feff7c41-aa8f-4653-bf76-f8bcd42bea65 for instance with vm_state building and task_state spawning. 
[ 791.666885] env[68617]: DEBUG nova.compute.manager [req-2ddace6a-f5f4-470f-bc11-e5445a912d73 req-8d531d10-d92f-4583-aa86-86ad91ae8456 service nova] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Received event network-changed-feff7c41-aa8f-4653-bf76-f8bcd42bea65 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 791.667056] env[68617]: DEBUG nova.compute.manager [req-2ddace6a-f5f4-470f-bc11-e5445a912d73 req-8d531d10-d92f-4583-aa86-86ad91ae8456 service nova] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Refreshing instance network info cache due to event network-changed-feff7c41-aa8f-4653-bf76-f8bcd42bea65. {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 791.667248] env[68617]: DEBUG oslo_concurrency.lockutils [req-2ddace6a-f5f4-470f-bc11-e5445a912d73 req-8d531d10-d92f-4583-aa86-86ad91ae8456 service nova] Acquiring lock "refresh_cache-71b1ebba-2019-4378-9bd2-98a7559c22e8" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 791.667431] env[68617]: DEBUG oslo_concurrency.lockutils [req-2ddace6a-f5f4-470f-bc11-e5445a912d73 req-8d531d10-d92f-4583-aa86-86ad91ae8456 service nova] Acquired lock "refresh_cache-71b1ebba-2019-4378-9bd2-98a7559c22e8" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 791.667542] env[68617]: DEBUG nova.network.neutron [req-2ddace6a-f5f4-470f-bc11-e5445a912d73 req-8d531d10-d92f-4583-aa86-86ad91ae8456 service nova] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Refreshing network info cache for port feff7c41-aa8f-4653-bf76-f8bcd42bea65 {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 791.791392] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470734, 'name': CreateVM_Task, 'duration_secs': 0.304265} completed successfully. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 791.791728] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 791.792709] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 791.792878] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 791.793215] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 791.793503] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7ef16c62-9b88-46d9-9360-670af619e18a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 791.799261] env[68617]: DEBUG oslo_vmware.api [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Waiting for the task: (returnval){ [ 791.799261] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]529a71a1-93a1-8568-c98c-e10cac40a8a1" [ 791.799261] env[68617]: _type = "Task" [ 791.799261] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 791.808524] env[68617]: DEBUG oslo_vmware.api [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]529a71a1-93a1-8568-c98c-e10cac40a8a1, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 791.951476] env[68617]: DEBUG nova.network.neutron [req-2ddace6a-f5f4-470f-bc11-e5445a912d73 req-8d531d10-d92f-4583-aa86-86ad91ae8456 service nova] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Updated VIF entry in instance network info cache for port feff7c41-aa8f-4653-bf76-f8bcd42bea65. 
{{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 791.951825] env[68617]: DEBUG nova.network.neutron [req-2ddace6a-f5f4-470f-bc11-e5445a912d73 req-8d531d10-d92f-4583-aa86-86ad91ae8456 service nova] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Updating instance_info_cache with network_info: [{"id": "feff7c41-aa8f-4653-bf76-f8bcd42bea65", "address": "fa:16:3e:cb:b9:d8", "network": {"id": "11f566b0-645d-4983-bcf9-21fd2838775d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1026017301-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ec839b3b8dc74ab4aaf0c9fff8794afe", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "74f30339-6421-4654-bddb-81d7f34db9d7", "external-id": "nsx-vlan-transportzone-899", "segmentation_id": 899, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfeff7c41-aa", "ovs_interfaceid": "feff7c41-aa8f-4653-bf76-f8bcd42bea65", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 791.963289] env[68617]: DEBUG oslo_concurrency.lockutils [req-2ddace6a-f5f4-470f-bc11-e5445a912d73 req-8d531d10-d92f-4583-aa86-86ad91ae8456 service nova] Releasing lock "refresh_cache-71b1ebba-2019-4378-9bd2-98a7559c22e8" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 792.314869] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 792.315219] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 792.315361] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 794.052390] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Acquiring lock "82864ac3-a199-478c-8c57-97ea0a256201" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 794.052674] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Lock "82864ac3-a199-478c-8c57-97ea0a256201" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 821.574120] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 821.574385] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 821.574420] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 821.594399] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 821.594561] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 821.594695] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 821.594822] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 821.594947] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 821.595164] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 821.595311] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Skipping network cache update for instance because it is Building. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 821.595437] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 821.595551] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 821.595715] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 821.595842] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 821.698858] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 821.699106] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 822.694091] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 822.698693] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 822.698884] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 822.710098] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 822.710317] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 822.710489] env[68617]: DEBUG oslo_concurrency.lockutils [None 
req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 822.710637] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 822.711718] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c842b017-2f14-4a5e-a7e6-8a4fa183f978 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 822.720448] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ed97d88-d001-4097-bb66-4e442f0e1eb3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 822.735507] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b59bfcf2-7205-4985-9fc8-c0c8284e21df {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 822.741928] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9863b222-4f35-4c9f-a5a4-57eec0aff3bb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 822.770513] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180934MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 822.770670] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 822.770859] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 822.843142] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance c507115c-92a0-4513-aae8-7dc8f95bc0ea actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 822.843318] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 5f4991a3-c34b-45b1-a3af-94d7d990eef1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 822.843447] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b95883b2-0366-4f52-bdf2-aa6259fafc58 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 822.843572] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 050e2b27-1311-4a9a-b5cf-6bc2f7128eba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 822.843731] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f13242a0-7e65-4d68-a317-16fb8c4b8f8a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 822.843864] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 3b95678b-dfc5-4610-a51e-2ae12fbe274b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 822.843989] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6300077d-5aa7-4794-8ba2-1ec30151c15c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 822.844154] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 4ea5887f-84bd-4629-b568-e73c78af0ad4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 822.844275] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6eef6e24-cf49-458b-ae37-8da4e02045f8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 822.844390] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 71b1ebba-2019-4378-9bd2-98a7559c22e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 822.856666] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e6b6cbdd-11d6-44a6-8da7-98e0f52cef67 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 822.867387] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b27ace75-e2fa-4acc-96cb-88dd49b89de5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 822.878388] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 152f9e1d-dd1b-486f-94b8-8202c0f2d335 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 822.888741] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 9d10a63c-4c97-48c3-aca8-fd317aa2fbe7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 822.899057] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e6e6c910-9485-48b0-bffa-4534cd7f87d4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 822.909033] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 1ec954d1-1bc9-4db3-9a48-7da759cebf21 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 822.919692] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 7d51d3c0-12fd-4118-80c6-16c1cca346db has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 822.929296] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a43cf82a-c969-47eb-b8dc-d7fe7f7870d3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 822.938617] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 40c6521b-51d9-45cf-959c-21e4f3da7eb9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 822.947747] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance dae068af-0c54-4715-bdc3-ecfd018b6294 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 822.956558] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance ee6e18cd-9af2-4440-8336-9e1858c28709 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 822.966624] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 995585f5-57a4-4ba6-9e28-18a086af264c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 822.976030] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 53abf4e7-35ea-415b-8a90-a89442c475a1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 822.985310] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 8d0a643a-96c5-4d47-aa2a-9f777e80c259 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 822.996082] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6341f8b6-9f42-4ab2-806f-dbad62de5376 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 823.005264] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance cc32959e-71ea-44cb-aebe-bf6a893ebb18 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 823.014566] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 704e4d83-19ef-493a-a374-dce0de95e975 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 823.025045] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance df9a6dc4-abb5-4855-ac4f-5479dd0b6498 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 823.033127] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f6f64438-8279-4ff4-ab80-efd1e17d7e04 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 823.043159] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f2bef6cc-f5e2-41a8-b377-31f016746257 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 823.051775] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance fe74e8d8-e439-4834-9721-08d9e64c7740 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 823.062148] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 82864ac3-a199-478c-8c57-97ea0a256201 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 823.062468] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 823.062674] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 823.425170] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7948aa9-4048-4fea-bff0-f6d121cdede5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 823.434191] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f260c2d-6af6-4f3c-91d9-21a54656c64b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 823.464542] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69209c25-c1f4-4bb7-b9ae-778b535a92f9 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 823.471591] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6fa914b-292c-4df4-9ebc-20a80a755baa {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 823.484069] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 823.493403] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 823.506641] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) 
_update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 823.506854] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.736s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 824.507677] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 824.507997] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 824.508108] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 824.699555] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 837.177203] env[68617]: WARNING oslo_vmware.rw_handles [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 837.177203] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 837.177203] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 837.177203] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 837.177203] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 837.177203] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 837.177203] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 837.177203] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 837.177203] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 837.177203] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 837.177203] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 837.177203] env[68617]: ERROR oslo_vmware.rw_handles [ 837.177761] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/38d65540-7909-4f90-8230-5035415edb8b/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 
{{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 837.179487] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 837.179795] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Copying Virtual Disk [datastore2] vmware_temp/38d65540-7909-4f90-8230-5035415edb8b/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/38d65540-7909-4f90-8230-5035415edb8b/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 837.180184] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-fd591e4a-1f59-4567-af97-96df847ffacb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 837.188011] env[68617]: DEBUG oslo_vmware.api [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Waiting for the task: (returnval){ [ 837.188011] env[68617]: value = "task-3470735" [ 837.188011] env[68617]: _type = "Task" [ 837.188011] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 837.195936] env[68617]: DEBUG oslo_vmware.api [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Task: {'id': task-3470735, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 837.699232] env[68617]: DEBUG oslo_vmware.exceptions [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Fault InvalidArgument not matched. 
{{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 837.699528] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 837.700086] env[68617]: ERROR nova.compute.manager [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 837.700086] env[68617]: Faults: ['InvalidArgument'] [ 837.700086] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Traceback (most recent call last): [ 837.700086] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 837.700086] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] yield resources [ 837.700086] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 837.700086] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] self.driver.spawn(context, instance, image_meta, [ 837.700086] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 837.700086] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] self._vmops.spawn(context, instance, image_meta, injected_files, [ 837.700086] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 837.700086] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] self._fetch_image_if_missing(context, vi) [ 837.700086] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 837.700445] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] image_cache(vi, tmp_image_ds_loc) [ 837.700445] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 837.700445] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] vm_util.copy_virtual_disk( [ 837.700445] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 837.700445] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] session._wait_for_task(vmdk_copy_task) [ 837.700445] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 837.700445] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] return self.wait_for_task(task_ref) [ 837.700445] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 837.700445] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] return evt.wait() [ 837.700445] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 837.700445] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] result = hub.switch() [ 837.700445] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 837.700445] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] return self.greenlet.switch() [ 837.700804] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 837.700804] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] self.f(*self.args, **self.kw) [ 837.700804] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 837.700804] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] raise exceptions.translate_fault(task_info.error) [ 837.700804] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 837.700804] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Faults: ['InvalidArgument'] [ 837.700804] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] [ 837.700804] env[68617]: INFO nova.compute.manager [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Terminating instance [ 837.701892] env[68617]: DEBUG oslo_concurrency.lockutils [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 837.702109] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 837.702347] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-67a98cf2-fb45-4954-ae92-0f87fa30b1d2 
{{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 837.704715] env[68617]: DEBUG nova.compute.manager [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 837.704932] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 837.705692] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7603054f-823b-4c31-ab17-ce3573a23930 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 837.712198] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 837.712411] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-71a2862c-2017-4e32-bae1-4c1699b970bf {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 837.714616] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 837.714784] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 837.715733] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-aaa36eef-8e29-453e-8af8-2f228a92cf1e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 837.720479] env[68617]: DEBUG oslo_vmware.api [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Waiting for the task: (returnval){ [ 837.720479] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52b9a386-43a2-f2a6-cd5e-c5cbe4476f12" [ 837.720479] env[68617]: _type = "Task" [ 837.720479] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 837.733265] env[68617]: DEBUG oslo_vmware.api [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52b9a386-43a2-f2a6-cd5e-c5cbe4476f12, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 837.783014] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 837.783267] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 837.783447] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Deleting the datastore file [datastore2] c507115c-92a0-4513-aae8-7dc8f95bc0ea {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 837.783837] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7a7a9e72-a818-4cf2-8fef-8f84ed80edca {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 837.790403] env[68617]: DEBUG oslo_vmware.api [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Waiting for the task: (returnval){ [ 837.790403] env[68617]: value = "task-3470737" [ 837.790403] env[68617]: _type = "Task" [ 837.790403] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 837.798560] env[68617]: DEBUG oslo_vmware.api [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Task: {'id': task-3470737, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 838.231420] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 838.231700] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Creating directory with path [datastore2] vmware_temp/60e525eb-dd30-4755-8d91-8c40c91cefe1/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 838.231936] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cfdf72ee-d9c2-4807-988c-23119eed4a44 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 838.243066] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Created directory with path [datastore2] vmware_temp/60e525eb-dd30-4755-8d91-8c40c91cefe1/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 838.243066] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Fetch image to [datastore2] vmware_temp/60e525eb-dd30-4755-8d91-8c40c91cefe1/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 838.243250] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/60e525eb-dd30-4755-8d91-8c40c91cefe1/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 838.243930] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e24df73a-0db2-4c48-9639-a61587a4ba3d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 838.250494] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5c47695-c3ea-471f-b964-fba8eccb8465 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 838.259198] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07350cac-4c66-4f94-9b8b-fce65a1a7c81 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 838.290476] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-8bc4ee67-5b75-44ae-abf1-1d13c29a1330 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 838.301584] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-903a0a88-2ccb-4c84-9d53-7db9a4cc3166 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 838.303302] env[68617]: DEBUG oslo_vmware.api [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Task: {'id': task-3470737, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07715} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 838.303545] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 838.303717] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 838.303881] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 838.304058] env[68617]: INFO nova.compute.manager [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 838.306134] env[68617]: DEBUG nova.compute.claims [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 838.306303] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 838.306506] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 838.335596] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 838.392144] env[68617]: DEBUG oslo_vmware.rw_handles [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/60e525eb-dd30-4755-8d91-8c40c91cefe1/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 838.451125] env[68617]: DEBUG oslo_vmware.rw_handles [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 838.451322] env[68617]: DEBUG oslo_vmware.rw_handles [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/60e525eb-dd30-4755-8d91-8c40c91cefe1/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 838.767960] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e58f8135-badc-44f8-ae3e-95fd1b4f3b5c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 838.775128] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d46f4036-f35c-46e5-a869-66ae86590ab5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 838.806576] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20f457cd-0958-4283-ae1c-a7ceaa9f8fa0 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 838.811922] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a2c5883-0553-4733-83e5-6c8bfd12fff6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 838.824923] env[68617]: DEBUG nova.compute.provider_tree [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 838.833515] env[68617]: DEBUG nova.scheduler.client.report [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 838.848201] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.542s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 838.848728] env[68617]: ERROR nova.compute.manager [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 838.848728] env[68617]: Faults: ['InvalidArgument'] [ 838.848728] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Traceback (most recent call last): [ 838.848728] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 
838.848728] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] self.driver.spawn(context, instance, image_meta, [ 838.848728] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 838.848728] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] self._vmops.spawn(context, instance, image_meta, injected_files, [ 838.848728] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 838.848728] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] self._fetch_image_if_missing(context, vi) [ 838.848728] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 838.848728] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] image_cache(vi, tmp_image_ds_loc) [ 838.848728] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 838.849068] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] vm_util.copy_virtual_disk( [ 838.849068] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 838.849068] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] session._wait_for_task(vmdk_copy_task) [ 838.849068] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 838.849068] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] return self.wait_for_task(task_ref) [ 838.849068] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 838.849068] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] return evt.wait() [ 838.849068] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 838.849068] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] result = hub.switch() [ 838.849068] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 838.849068] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] return self.greenlet.switch() [ 838.849068] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 838.849068] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] self.f(*self.args, **self.kw) [ 838.849402] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 838.849402] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] raise exceptions.translate_fault(task_info.error) [ 838.849402] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 838.849402] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Faults: ['InvalidArgument'] [ 838.849402] env[68617]: ERROR nova.compute.manager [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] [ 838.849524] env[68617]: DEBUG nova.compute.utils [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 838.850796] env[68617]: DEBUG nova.compute.manager [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Build of instance c507115c-92a0-4513-aae8-7dc8f95bc0ea was re-scheduled: A specified parameter was not correct: fileType [ 838.850796] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 838.851183] env[68617]: DEBUG nova.compute.manager [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 838.851353] env[68617]: DEBUG nova.compute.manager [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 838.851506] env[68617]: DEBUG nova.compute.manager [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 838.851665] env[68617]: DEBUG nova.network.neutron [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 839.173021] env[68617]: DEBUG nova.network.neutron [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 839.188345] env[68617]: INFO nova.compute.manager [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] [instance: c507115c-92a0-4513-aae8-7dc8f95bc0ea] Took 0.34 seconds to deallocate network for instance. [ 839.296015] env[68617]: INFO nova.scheduler.client.report [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Deleted allocations for instance c507115c-92a0-4513-aae8-7dc8f95bc0ea [ 839.317028] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ba331776-b5a4-4c82-9966-d1b599e7894d tempest-ServerDiagnosticsNegativeTest-37496833 tempest-ServerDiagnosticsNegativeTest-37496833-project-member] Lock "c507115c-92a0-4513-aae8-7dc8f95bc0ea" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 149.329s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 839.342491] env[68617]: DEBUG nova.compute.manager [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Starting instance... 
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 839.391537] env[68617]: DEBUG oslo_concurrency.lockutils [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 839.391789] env[68617]: DEBUG oslo_concurrency.lockutils [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 839.393285] env[68617]: INFO nova.compute.claims [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 839.806085] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22cedec5-e789-4b88-8fba-06cc2c94f121 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 839.814103] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09a9ce8b-104b-44be-8559-66c60accd31d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 839.846067] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7230981d-2f71-4017-8530-75c41761b1bb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 839.853959] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-954d9170-0d49-4ac0-bc23-8b6619ad5003 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 839.869369] env[68617]: DEBUG nova.compute.provider_tree [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 839.878609] env[68617]: DEBUG nova.scheduler.client.report [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 839.894358] env[68617]: DEBUG oslo_concurrency.lockutils 
[None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.502s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 839.894834] env[68617]: DEBUG nova.compute.manager [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 839.928886] env[68617]: DEBUG nova.compute.utils [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 839.930485] env[68617]: DEBUG nova.compute.manager [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 839.930652] env[68617]: DEBUG nova.network.neutron [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 839.943831] env[68617]: DEBUG nova.compute.manager [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 840.011810] env[68617]: DEBUG nova.policy [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6cf7d315d274463086d07fed7bafe1a6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f090bdd2a5d643b99d3a13a492697e75', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 840.014820] env[68617]: DEBUG nova.compute.manager [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 840.040254] env[68617]: DEBUG nova.virt.hardware [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 840.040500] env[68617]: DEBUG nova.virt.hardware [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 840.040653] env[68617]: DEBUG nova.virt.hardware [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 840.040828] env[68617]: DEBUG nova.virt.hardware [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 840.040975] env[68617]: DEBUG nova.virt.hardware [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 840.041134] env[68617]: DEBUG nova.virt.hardware [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 840.041342] env[68617]: DEBUG nova.virt.hardware [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 840.041494] env[68617]: DEBUG nova.virt.hardware [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 840.041656] 
env[68617]: DEBUG nova.virt.hardware [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 840.041814] env[68617]: DEBUG nova.virt.hardware [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 840.042169] env[68617]: DEBUG nova.virt.hardware [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 840.042836] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e80f604c-3a8c-4930-aea6-b5e13ab93ef2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 840.050738] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-686f3730-e7d1-448f-a248-793f56c1b7bb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 840.391462] env[68617]: DEBUG nova.network.neutron [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Successfully created port: 00ed81c9-ecdf-45fa-adc3-b359f43f032b {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 840.983179] env[68617]: DEBUG nova.network.neutron [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Successfully updated port: 00ed81c9-ecdf-45fa-adc3-b359f43f032b {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 840.997471] env[68617]: DEBUG oslo_concurrency.lockutils [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Acquiring lock "refresh_cache-e6b6cbdd-11d6-44a6-8da7-98e0f52cef67" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 840.997816] env[68617]: DEBUG oslo_concurrency.lockutils [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Acquired lock "refresh_cache-e6b6cbdd-11d6-44a6-8da7-98e0f52cef67" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 840.997816] env[68617]: DEBUG nova.network.neutron [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 841.040654] env[68617]: DEBUG nova.network.neutron [None 
req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 841.205009] env[68617]: DEBUG nova.network.neutron [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Updating instance_info_cache with network_info: [{"id": "00ed81c9-ecdf-45fa-adc3-b359f43f032b", "address": "fa:16:3e:0a:c9:5a", "network": {"id": "e3aee9db-8596-4ea8-943e-5c365382ee22", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.110", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "f1a3ab6230dd468b8019424ce71de8ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cde23701-02ca-4cb4-b5a6-d321f8ac9660", "external-id": "nsx-vlan-transportzone-586", "segmentation_id": 586, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap00ed81c9-ec", "ovs_interfaceid": "00ed81c9-ecdf-45fa-adc3-b359f43f032b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 841.217611] env[68617]: DEBUG oslo_concurrency.lockutils [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Releasing lock "refresh_cache-e6b6cbdd-11d6-44a6-8da7-98e0f52cef67" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 841.217953] env[68617]: DEBUG nova.compute.manager [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Instance network_info: |[{"id": "00ed81c9-ecdf-45fa-adc3-b359f43f032b", "address": "fa:16:3e:0a:c9:5a", "network": {"id": "e3aee9db-8596-4ea8-943e-5c365382ee22", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.110", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "f1a3ab6230dd468b8019424ce71de8ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cde23701-02ca-4cb4-b5a6-d321f8ac9660", "external-id": "nsx-vlan-transportzone-586", "segmentation_id": 586, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap00ed81c9-ec", "ovs_interfaceid": "00ed81c9-ecdf-45fa-adc3-b359f43f032b", "qbh_params": null, "qbg_params": null, "active": true, 
"vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 841.218364] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0a:c9:5a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'cde23701-02ca-4cb4-b5a6-d321f8ac9660', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '00ed81c9-ecdf-45fa-adc3-b359f43f032b', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 841.226188] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Creating folder: Project (f090bdd2a5d643b99d3a13a492697e75). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 841.226743] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-568f7037-5899-4893-b980-2f2e80a337d6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 841.237798] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Created folder: Project (f090bdd2a5d643b99d3a13a492697e75) in parent group-v693691. [ 841.237798] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Creating folder: Instances. Parent ref: group-v693730. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 841.237798] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-28ca426d-2aa9-4039-a220-59fb0d74af7e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 841.245829] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Created folder: Instances in parent group-v693730. [ 841.246070] env[68617]: DEBUG oslo.service.loopingcall [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 841.246270] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 841.246489] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7288df74-3901-4853-9b8f-b0945adf5672 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 841.265258] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 841.265258] env[68617]: value = "task-3470740" [ 841.265258] env[68617]: _type = "Task" [ 841.265258] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 841.272364] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470740, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 841.492858] env[68617]: DEBUG nova.compute.manager [req-4b5b9500-5031-4c8f-bf1c-1281b212f90a req-c52ee9ce-0766-4bd8-9724-2fe9dedc39db service nova] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Received event network-vif-plugged-00ed81c9-ecdf-45fa-adc3-b359f43f032b {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 841.493446] env[68617]: DEBUG oslo_concurrency.lockutils [req-4b5b9500-5031-4c8f-bf1c-1281b212f90a req-c52ee9ce-0766-4bd8-9724-2fe9dedc39db service nova] Acquiring lock "e6b6cbdd-11d6-44a6-8da7-98e0f52cef67-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 841.493738] env[68617]: DEBUG oslo_concurrency.lockutils [req-4b5b9500-5031-4c8f-bf1c-1281b212f90a req-c52ee9ce-0766-4bd8-9724-2fe9dedc39db service nova] Lock "e6b6cbdd-11d6-44a6-8da7-98e0f52cef67-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 841.493926] env[68617]: DEBUG oslo_concurrency.lockutils [req-4b5b9500-5031-4c8f-bf1c-1281b212f90a req-c52ee9ce-0766-4bd8-9724-2fe9dedc39db service nova] Lock "e6b6cbdd-11d6-44a6-8da7-98e0f52cef67-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 841.494107] env[68617]: DEBUG nova.compute.manager [req-4b5b9500-5031-4c8f-bf1c-1281b212f90a req-c52ee9ce-0766-4bd8-9724-2fe9dedc39db service nova] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] No waiting events found dispatching network-vif-plugged-00ed81c9-ecdf-45fa-adc3-b359f43f032b {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 841.494275] env[68617]: WARNING nova.compute.manager [req-4b5b9500-5031-4c8f-bf1c-1281b212f90a req-c52ee9ce-0766-4bd8-9724-2fe9dedc39db service nova] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Received unexpected event network-vif-plugged-00ed81c9-ecdf-45fa-adc3-b359f43f032b for instance with vm_state building and task_state spawning. 
[ 841.494444] env[68617]: DEBUG nova.compute.manager [req-4b5b9500-5031-4c8f-bf1c-1281b212f90a req-c52ee9ce-0766-4bd8-9724-2fe9dedc39db service nova] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Received event network-changed-00ed81c9-ecdf-45fa-adc3-b359f43f032b {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 841.494602] env[68617]: DEBUG nova.compute.manager [req-4b5b9500-5031-4c8f-bf1c-1281b212f90a req-c52ee9ce-0766-4bd8-9724-2fe9dedc39db service nova] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Refreshing instance network info cache due to event network-changed-00ed81c9-ecdf-45fa-adc3-b359f43f032b. {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 841.494789] env[68617]: DEBUG oslo_concurrency.lockutils [req-4b5b9500-5031-4c8f-bf1c-1281b212f90a req-c52ee9ce-0766-4bd8-9724-2fe9dedc39db service nova] Acquiring lock "refresh_cache-e6b6cbdd-11d6-44a6-8da7-98e0f52cef67" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 841.494925] env[68617]: DEBUG oslo_concurrency.lockutils [req-4b5b9500-5031-4c8f-bf1c-1281b212f90a req-c52ee9ce-0766-4bd8-9724-2fe9dedc39db service nova] Acquired lock "refresh_cache-e6b6cbdd-11d6-44a6-8da7-98e0f52cef67" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 841.495093] env[68617]: DEBUG nova.network.neutron [req-4b5b9500-5031-4c8f-bf1c-1281b212f90a req-c52ee9ce-0766-4bd8-9724-2fe9dedc39db service nova] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Refreshing network info cache for port 00ed81c9-ecdf-45fa-adc3-b359f43f032b {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 841.777137] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470740, 'name': CreateVM_Task, 'duration_secs': 0.291714} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 841.777963] env[68617]: DEBUG nova.network.neutron [req-4b5b9500-5031-4c8f-bf1c-1281b212f90a req-c52ee9ce-0766-4bd8-9724-2fe9dedc39db service nova] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Updated VIF entry in instance network info cache for port 00ed81c9-ecdf-45fa-adc3-b359f43f032b. 
{{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 841.778307] env[68617]: DEBUG nova.network.neutron [req-4b5b9500-5031-4c8f-bf1c-1281b212f90a req-c52ee9ce-0766-4bd8-9724-2fe9dedc39db service nova] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Updating instance_info_cache with network_info: [{"id": "00ed81c9-ecdf-45fa-adc3-b359f43f032b", "address": "fa:16:3e:0a:c9:5a", "network": {"id": "e3aee9db-8596-4ea8-943e-5c365382ee22", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.110", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "f1a3ab6230dd468b8019424ce71de8ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cde23701-02ca-4cb4-b5a6-d321f8ac9660", "external-id": "nsx-vlan-transportzone-586", "segmentation_id": 586, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap00ed81c9-ec", "ovs_interfaceid": "00ed81c9-ecdf-45fa-adc3-b359f43f032b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 841.779319] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 841.779935] env[68617]: DEBUG oslo_concurrency.lockutils [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 841.780106] env[68617]: DEBUG oslo_concurrency.lockutils [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 841.780403] env[68617]: DEBUG oslo_concurrency.lockutils [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 841.780639] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-935e5dba-01cd-48c5-acb2-d11358525ac5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 841.788458] env[68617]: DEBUG oslo_vmware.api [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Waiting for the task: (returnval){ [ 841.788458] 
env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52a3ef25-ec56-7d8f-4dd8-19fc43feb14d" [ 841.788458] env[68617]: _type = "Task" [ 841.788458] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 841.790432] env[68617]: DEBUG oslo_concurrency.lockutils [req-4b5b9500-5031-4c8f-bf1c-1281b212f90a req-c52ee9ce-0766-4bd8-9724-2fe9dedc39db service nova] Releasing lock "refresh_cache-e6b6cbdd-11d6-44a6-8da7-98e0f52cef67" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 841.795590] env[68617]: DEBUG oslo_vmware.api [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52a3ef25-ec56-7d8f-4dd8-19fc43feb14d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 842.296480] env[68617]: DEBUG oslo_concurrency.lockutils [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 842.296754] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 842.296971] env[68617]: DEBUG oslo_concurrency.lockutils [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 848.689375] env[68617]: DEBUG oslo_concurrency.lockutils [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Acquiring lock "dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 848.689658] env[68617]: DEBUG oslo_concurrency.lockutils [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Lock "dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 849.314069] env[68617]: DEBUG oslo_concurrency.lockutils [None req-232bfc7e-96ff-4f45-9df5-245b28c10087 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Acquiring lock "39f1e776-4df9-4b24-a51b-c1a15a943a76" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 849.314415] env[68617]: DEBUG oslo_concurrency.lockutils [None req-232bfc7e-96ff-4f45-9df5-245b28c10087 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Lock "39f1e776-4df9-4b24-a51b-c1a15a943a76" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 880.694607] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 880.719325] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 880.720032] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 880.720032] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 880.738796] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 880.738867] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 880.738992] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 880.739137] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 880.739261] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 880.739383] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Skipping network cache update for instance because it is Building. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 880.739502] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 880.739893] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 880.739893] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 880.739893] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 880.740089] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 883.698578] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 883.698894] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 883.699053] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 883.699194] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 883.699337] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 884.694580] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 884.698286] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 884.698510] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 884.710139] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 884.710493] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 884.710493] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 884.710560] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 884.711628] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4068de57-6576-4f94-aae4-5c7a941a52bf {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 884.721066] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d784156-c341-4a29-9b9c-f931bad7b562 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 884.736197] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c83db98-160d-4823-aee9-c4f4378cae73 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 884.742974] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38d9f786-a323-4814-850d-9a565e325584 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 884.772009] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] 
Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180886MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 884.772219] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 884.772488] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 884.846021] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 5f4991a3-c34b-45b1-a3af-94d7d990eef1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 884.846234] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b95883b2-0366-4f52-bdf2-aa6259fafc58 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 884.846364] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 050e2b27-1311-4a9a-b5cf-6bc2f7128eba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 884.846483] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f13242a0-7e65-4d68-a317-16fb8c4b8f8a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 884.846875] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 3b95678b-dfc5-4610-a51e-2ae12fbe274b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 884.846875] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6300077d-5aa7-4794-8ba2-1ec30151c15c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 884.846875] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 4ea5887f-84bd-4629-b568-e73c78af0ad4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 884.847040] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6eef6e24-cf49-458b-ae37-8da4e02045f8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 884.847078] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 71b1ebba-2019-4378-9bd2-98a7559c22e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 884.847180] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e6b6cbdd-11d6-44a6-8da7-98e0f52cef67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 884.858684] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b27ace75-e2fa-4acc-96cb-88dd49b89de5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 884.869106] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 152f9e1d-dd1b-486f-94b8-8202c0f2d335 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 884.879911] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 9d10a63c-4c97-48c3-aca8-fd317aa2fbe7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 884.889424] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e6e6c910-9485-48b0-bffa-4534cd7f87d4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 884.898448] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 1ec954d1-1bc9-4db3-9a48-7da759cebf21 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 884.907459] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 7d51d3c0-12fd-4118-80c6-16c1cca346db has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 884.916785] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a43cf82a-c969-47eb-b8dc-d7fe7f7870d3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 884.926532] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 40c6521b-51d9-45cf-959c-21e4f3da7eb9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 884.944750] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance dae068af-0c54-4715-bdc3-ecfd018b6294 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 884.954348] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance ee6e18cd-9af2-4440-8336-9e1858c28709 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 884.963557] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 995585f5-57a4-4ba6-9e28-18a086af264c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 884.973499] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 53abf4e7-35ea-415b-8a90-a89442c475a1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 884.983812] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 8d0a643a-96c5-4d47-aa2a-9f777e80c259 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 884.995617] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6341f8b6-9f42-4ab2-806f-dbad62de5376 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 885.005624] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance cc32959e-71ea-44cb-aebe-bf6a893ebb18 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 885.017243] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 704e4d83-19ef-493a-a374-dce0de95e975 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 885.027093] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance df9a6dc4-abb5-4855-ac4f-5479dd0b6498 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 885.038039] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f6f64438-8279-4ff4-ab80-efd1e17d7e04 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 885.048117] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f2bef6cc-f5e2-41a8-b377-31f016746257 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 885.058311] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance fe74e8d8-e439-4834-9721-08d9e64c7740 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 885.067911] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 82864ac3-a199-478c-8c57-97ea0a256201 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 885.078738] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 885.088895] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 39f1e776-4df9-4b24-a51b-c1a15a943a76 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 885.089171] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 885.089340] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 885.495315] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e34ddc5-5e06-4c8d-8e35-b238b8d2098a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 885.503213] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d531d80e-7bf6-47c5-877b-78a29f42202a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 885.532323] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d3f9474-ace8-4041-a83c-87d847963879 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 885.539690] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86362936-59cc-4af6-bb59-92c912230c15 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 885.552346] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 885.563413] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 885.577021] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 885.577021] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.804s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 886.578475] env[68617]: DEBUG oslo_service.periodic_task [None 
req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 887.658531] env[68617]: WARNING oslo_vmware.rw_handles [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 887.658531] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 887.658531] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 887.658531] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 887.658531] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 887.658531] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 887.658531] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 887.658531] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 887.658531] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 887.658531] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 887.658531] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 887.658531] env[68617]: ERROR oslo_vmware.rw_handles [ 887.658531] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/60e525eb-dd30-4755-8d91-8c40c91cefe1/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 887.660360] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 887.660578] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Copying Virtual Disk [datastore2] vmware_temp/60e525eb-dd30-4755-8d91-8c40c91cefe1/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/60e525eb-dd30-4755-8d91-8c40c91cefe1/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 887.660864] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e9cc52bb-e996-471c-89eb-739775d8a8fb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 887.669211] env[68617]: DEBUG oslo_vmware.api [None 
req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Waiting for the task: (returnval){ [ 887.669211] env[68617]: value = "task-3470741" [ 887.669211] env[68617]: _type = "Task" [ 887.669211] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 887.677170] env[68617]: DEBUG oslo_vmware.api [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Task: {'id': task-3470741, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 888.179073] env[68617]: DEBUG oslo_vmware.exceptions [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Fault InvalidArgument not matched. {{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 888.179552] env[68617]: DEBUG oslo_concurrency.lockutils [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 888.179950] env[68617]: ERROR nova.compute.manager [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 888.179950] env[68617]: Faults: ['InvalidArgument'] [ 888.179950] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Traceback (most recent call last): [ 888.179950] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 888.179950] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] yield resources [ 888.179950] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 888.179950] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] self.driver.spawn(context, instance, image_meta, [ 888.179950] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 888.179950] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 888.179950] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 888.179950] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] self._fetch_image_if_missing(context, vi) [ 888.179950] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 888.180318] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] image_cache(vi, tmp_image_ds_loc) [ 888.180318] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 888.180318] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] vm_util.copy_virtual_disk( [ 888.180318] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 888.180318] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] session._wait_for_task(vmdk_copy_task) [ 888.180318] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 888.180318] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] return self.wait_for_task(task_ref) [ 888.180318] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 888.180318] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] return evt.wait() [ 888.180318] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 888.180318] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] result = hub.switch() [ 888.180318] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 888.180318] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] return self.greenlet.switch() [ 888.180694] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 888.180694] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] self.f(*self.args, **self.kw) [ 888.180694] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 888.180694] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] raise exceptions.translate_fault(task_info.error) [ 888.180694] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 888.180694] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Faults: ['InvalidArgument'] [ 888.180694] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] [ 888.180694] env[68617]: INFO nova.compute.manager [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] 
Terminating instance [ 888.181762] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 888.181962] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 888.182210] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0b5c3558-b17a-4292-9de3-94e6f9f68c26 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 888.184695] env[68617]: DEBUG nova.compute.manager [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 888.184986] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 888.185831] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91be5ce0-949c-43b8-a629-1f56ef08932d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 888.193282] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 888.193465] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-361b572a-71b3-4d01-9cbd-8a3af480cae9 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 888.195543] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 888.195723] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 888.196700] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3e1779cc-2579-4031-a903-724eef495515 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 888.201290] env[68617]: DEBUG oslo_vmware.api [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Waiting for the task: (returnval){ [ 888.201290] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52ff6dd3-b18b-108e-ca1c-adbfb028e56d" [ 888.201290] env[68617]: _type = "Task" [ 888.201290] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 888.208354] env[68617]: DEBUG oslo_vmware.api [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52ff6dd3-b18b-108e-ca1c-adbfb028e56d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 888.261813] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 888.262143] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 888.262325] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Deleting the datastore file [datastore2] 5f4991a3-c34b-45b1-a3af-94d7d990eef1 {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 888.262575] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b1664f7c-ea20-4811-b5c7-0f04e6b28e60 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 888.268216] env[68617]: DEBUG oslo_vmware.api [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Waiting for the task: (returnval){ [ 888.268216] env[68617]: value = "task-3470743" [ 888.268216] env[68617]: _type = "Task" [ 888.268216] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 888.275813] env[68617]: DEBUG oslo_vmware.api [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Task: {'id': task-3470743, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 888.711869] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 888.712153] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Creating directory with path [datastore2] vmware_temp/ca7b2a39-463c-4c8f-83e9-5edadd25ffdf/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 888.712385] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-eb491a84-229e-4f45-8c9c-6df6fdffd71a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 888.724515] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Created directory with path [datastore2] vmware_temp/ca7b2a39-463c-4c8f-83e9-5edadd25ffdf/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 888.724734] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Fetch image to [datastore2] vmware_temp/ca7b2a39-463c-4c8f-83e9-5edadd25ffdf/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 888.724904] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/ca7b2a39-463c-4c8f-83e9-5edadd25ffdf/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 888.725676] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b99d3d67-75a1-4db9-b5f1-02872b526683 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 888.732967] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a1a0bc1-1fbb-4366-848f-5eac6b9a382d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 888.742430] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c76217d7-def8-475a-8d0e-a2ec6ecfb47e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 888.777511] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c4d6c8f-b509-42de-9269-14314d64e625 
{{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 888.786359] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d0c6f1fc-2b9b-4b81-8236-d52c0f7fd42b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 888.788073] env[68617]: DEBUG oslo_vmware.api [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Task: {'id': task-3470743, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067173} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 888.788233] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 888.788415] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 888.788579] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 888.788749] env[68617]: INFO nova.compute.manager [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 888.790835] env[68617]: DEBUG nova.compute.claims [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 888.791010] env[68617]: DEBUG oslo_concurrency.lockutils [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 888.791233] env[68617]: DEBUG oslo_concurrency.lockutils [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 888.808010] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 888.860691] env[68617]: DEBUG oslo_vmware.rw_handles [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ca7b2a39-463c-4c8f-83e9-5edadd25ffdf/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 888.921720] env[68617]: DEBUG oslo_vmware.rw_handles [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 888.921954] env[68617]: DEBUG oslo_vmware.rw_handles [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ca7b2a39-463c-4c8f-83e9-5edadd25ffdf/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 889.271031] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cbcaa00-5f6a-4826-8ea4-0a910d96ff19 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 889.278837] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-425f6fd4-803e-4fc0-b0ac-58fa6f641b69 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 889.308398] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff50e0e5-0639-41bc-b5c7-6f88190e73e1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 889.315208] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cfd0461-3888-4ea2-83d7-b4d4e053d722 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 889.327862] env[68617]: DEBUG nova.compute.provider_tree [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 889.336894] env[68617]: DEBUG nova.scheduler.client.report [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 889.349694] env[68617]: DEBUG oslo_concurrency.lockutils [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.558s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 889.350221] env[68617]: ERROR nova.compute.manager [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 889.350221] env[68617]: Faults: ['InvalidArgument'] [ 889.350221] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Traceback (most recent call last): [ 889.350221] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 889.350221] env[68617]: ERROR 
nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] self.driver.spawn(context, instance, image_meta, [ 889.350221] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 889.350221] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 889.350221] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 889.350221] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] self._fetch_image_if_missing(context, vi) [ 889.350221] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 889.350221] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] image_cache(vi, tmp_image_ds_loc) [ 889.350221] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 889.350530] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] vm_util.copy_virtual_disk( [ 889.350530] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 889.350530] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] session._wait_for_task(vmdk_copy_task) [ 889.350530] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 889.350530] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] return self.wait_for_task(task_ref) [ 889.350530] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 889.350530] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] return evt.wait() [ 889.350530] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 889.350530] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] result = hub.switch() [ 889.350530] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 889.350530] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] return self.greenlet.switch() [ 889.350530] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 889.350530] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] self.f(*self.args, **self.kw) [ 889.350821] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 889.350821] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] raise exceptions.translate_fault(task_info.error) [ 889.350821] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 889.350821] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Faults: ['InvalidArgument'] [ 889.350821] env[68617]: ERROR nova.compute.manager [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] [ 889.350940] env[68617]: DEBUG nova.compute.utils [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 889.352169] env[68617]: DEBUG nova.compute.manager [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Build of instance 5f4991a3-c34b-45b1-a3af-94d7d990eef1 was re-scheduled: A specified parameter was not correct: fileType [ 889.352169] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 889.352554] env[68617]: DEBUG nova.compute.manager [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 889.352724] env[68617]: DEBUG nova.compute.manager [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 889.352887] env[68617]: DEBUG nova.compute.manager [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 889.353060] env[68617]: DEBUG nova.network.neutron [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 889.823420] env[68617]: DEBUG nova.network.neutron [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 889.839121] env[68617]: INFO nova.compute.manager [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] [instance: 5f4991a3-c34b-45b1-a3af-94d7d990eef1] Took 0.49 seconds to deallocate network for instance. [ 889.946572] env[68617]: INFO nova.scheduler.client.report [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Deleted allocations for instance 5f4991a3-c34b-45b1-a3af-94d7d990eef1 [ 889.971744] env[68617]: DEBUG oslo_concurrency.lockutils [None req-68707239-de34-4fd0-9dc3-a680003fba87 tempest-ImagesOneServerTestJSON-1712868828 tempest-ImagesOneServerTestJSON-1712868828-project-member] Lock "5f4991a3-c34b-45b1-a3af-94d7d990eef1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 197.326s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 889.986132] env[68617]: DEBUG nova.compute.manager [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Starting instance... 
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 890.036829] env[68617]: DEBUG oslo_concurrency.lockutils [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 890.037097] env[68617]: DEBUG oslo_concurrency.lockutils [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 890.038558] env[68617]: INFO nova.compute.claims [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 890.450678] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93a1f500-99c8-4933-85a6-acf978269e16 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 890.464019] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1c3d6fe-4873-4caf-a66b-3f148cf72ae8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 890.495312] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bd795c2-cb8f-4425-af59-c7cea4e35c1a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 890.504631] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5d67793-3de7-4fbf-ad3b-6fd8e93030c2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 890.518954] env[68617]: DEBUG nova.compute.provider_tree [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 890.529755] env[68617]: DEBUG nova.scheduler.client.report [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 890.545395] env[68617]: DEBUG oslo_concurrency.lockutils 
[None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.508s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 890.545871] env[68617]: DEBUG nova.compute.manager [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 890.581040] env[68617]: DEBUG nova.compute.utils [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 890.581706] env[68617]: DEBUG nova.compute.manager [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 890.581874] env[68617]: DEBUG nova.network.neutron [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 890.592470] env[68617]: DEBUG nova.compute.manager [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 890.641746] env[68617]: DEBUG nova.policy [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2c10f61e025e469890198c323de0578b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '828be21ced7d4d11a462ae49d04280ba', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 890.655387] env[68617]: DEBUG nova.compute.manager [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 890.681608] env[68617]: DEBUG nova.virt.hardware [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='84',id=12,is_public=True,memory_mb=192,name='m1.micro',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 890.681789] env[68617]: DEBUG nova.virt.hardware [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 890.681937] env[68617]: DEBUG nova.virt.hardware [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 890.682125] env[68617]: DEBUG nova.virt.hardware [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 890.682269] env[68617]: DEBUG nova.virt.hardware [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 890.682411] env[68617]: DEBUG nova.virt.hardware [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 890.682615] env[68617]: DEBUG nova.virt.hardware [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 890.682767] env[68617]: DEBUG nova.virt.hardware [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 890.682928] 
env[68617]: DEBUG nova.virt.hardware [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 890.683100] env[68617]: DEBUG nova.virt.hardware [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 890.683271] env[68617]: DEBUG nova.virt.hardware [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 890.684127] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85c5a8c1-3731-4af7-a57a-5fdcd92de9b6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 890.692906] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75f550f6-ef3f-4e54-8f98-22cbcdf4861c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 890.948145] env[68617]: DEBUG nova.network.neutron [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Successfully created port: 03cc80bd-ed65-4c79-ad95-90b2b50a3bf3 {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 891.774577] env[68617]: DEBUG nova.network.neutron [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Successfully updated port: 03cc80bd-ed65-4c79-ad95-90b2b50a3bf3 {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 891.789026] env[68617]: DEBUG oslo_concurrency.lockutils [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquiring lock "refresh_cache-b27ace75-e2fa-4acc-96cb-88dd49b89de5" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 891.789171] env[68617]: DEBUG oslo_concurrency.lockutils [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquired lock "refresh_cache-b27ace75-e2fa-4acc-96cb-88dd49b89de5" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 891.789315] env[68617]: DEBUG nova.network.neutron [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 891.854850] env[68617]: DEBUG nova.network.neutron [None 
req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 892.304693] env[68617]: DEBUG nova.network.neutron [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Updating instance_info_cache with network_info: [{"id": "03cc80bd-ed65-4c79-ad95-90b2b50a3bf3", "address": "fa:16:3e:3a:40:69", "network": {"id": "0f2e6893-43e2-458a-8326-dd03f1a6b1a7", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2034507765-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "828be21ced7d4d11a462ae49d04280ba", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "49b5df12-d801-4140-8816-2fd401608c7d", "external-id": "nsx-vlan-transportzone-326", "segmentation_id": 326, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap03cc80bd-ed", "ovs_interfaceid": "03cc80bd-ed65-4c79-ad95-90b2b50a3bf3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 892.323200] env[68617]: DEBUG oslo_concurrency.lockutils [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Releasing lock "refresh_cache-b27ace75-e2fa-4acc-96cb-88dd49b89de5" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 892.323200] env[68617]: DEBUG nova.compute.manager [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Instance network_info: |[{"id": "03cc80bd-ed65-4c79-ad95-90b2b50a3bf3", "address": "fa:16:3e:3a:40:69", "network": {"id": "0f2e6893-43e2-458a-8326-dd03f1a6b1a7", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2034507765-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "828be21ced7d4d11a462ae49d04280ba", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "49b5df12-d801-4140-8816-2fd401608c7d", "external-id": "nsx-vlan-transportzone-326", "segmentation_id": 326, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap03cc80bd-ed", "ovs_interfaceid": 
"03cc80bd-ed65-4c79-ad95-90b2b50a3bf3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 892.323421] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:3a:40:69', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '49b5df12-d801-4140-8816-2fd401608c7d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '03cc80bd-ed65-4c79-ad95-90b2b50a3bf3', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 892.331089] env[68617]: DEBUG oslo.service.loopingcall [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 892.331735] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 892.331870] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-aa22dcd9-bfa4-4654-bf48-5cc797771bd5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 892.352518] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 892.352518] env[68617]: value = "task-3470744" [ 892.352518] env[68617]: _type = "Task" [ 892.352518] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 892.363890] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470744, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 892.372583] env[68617]: DEBUG nova.compute.manager [req-167b6770-81b9-4940-958c-1ab9cf60d800 req-8a7ed568-2877-4c0b-922c-1cc877b1a232 service nova] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Received event network-vif-plugged-03cc80bd-ed65-4c79-ad95-90b2b50a3bf3 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 892.372794] env[68617]: DEBUG oslo_concurrency.lockutils [req-167b6770-81b9-4940-958c-1ab9cf60d800 req-8a7ed568-2877-4c0b-922c-1cc877b1a232 service nova] Acquiring lock "b27ace75-e2fa-4acc-96cb-88dd49b89de5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 892.372987] env[68617]: DEBUG oslo_concurrency.lockutils [req-167b6770-81b9-4940-958c-1ab9cf60d800 req-8a7ed568-2877-4c0b-922c-1cc877b1a232 service nova] Lock "b27ace75-e2fa-4acc-96cb-88dd49b89de5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 892.373168] env[68617]: DEBUG oslo_concurrency.lockutils [req-167b6770-81b9-4940-958c-1ab9cf60d800 req-8a7ed568-2877-4c0b-922c-1cc877b1a232 service nova] Lock "b27ace75-e2fa-4acc-96cb-88dd49b89de5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 892.373330] env[68617]: DEBUG nova.compute.manager [req-167b6770-81b9-4940-958c-1ab9cf60d800 req-8a7ed568-2877-4c0b-922c-1cc877b1a232 service nova] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] No waiting events found dispatching network-vif-plugged-03cc80bd-ed65-4c79-ad95-90b2b50a3bf3 {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 892.373491] env[68617]: WARNING nova.compute.manager [req-167b6770-81b9-4940-958c-1ab9cf60d800 req-8a7ed568-2877-4c0b-922c-1cc877b1a232 service nova] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Received unexpected event network-vif-plugged-03cc80bd-ed65-4c79-ad95-90b2b50a3bf3 for instance with vm_state building and task_state spawning. [ 892.373638] env[68617]: DEBUG nova.compute.manager [req-167b6770-81b9-4940-958c-1ab9cf60d800 req-8a7ed568-2877-4c0b-922c-1cc877b1a232 service nova] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Received event network-changed-03cc80bd-ed65-4c79-ad95-90b2b50a3bf3 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 892.373788] env[68617]: DEBUG nova.compute.manager [req-167b6770-81b9-4940-958c-1ab9cf60d800 req-8a7ed568-2877-4c0b-922c-1cc877b1a232 service nova] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Refreshing instance network info cache due to event network-changed-03cc80bd-ed65-4c79-ad95-90b2b50a3bf3. 
{{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 892.373997] env[68617]: DEBUG oslo_concurrency.lockutils [req-167b6770-81b9-4940-958c-1ab9cf60d800 req-8a7ed568-2877-4c0b-922c-1cc877b1a232 service nova] Acquiring lock "refresh_cache-b27ace75-e2fa-4acc-96cb-88dd49b89de5" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 892.374149] env[68617]: DEBUG oslo_concurrency.lockutils [req-167b6770-81b9-4940-958c-1ab9cf60d800 req-8a7ed568-2877-4c0b-922c-1cc877b1a232 service nova] Acquired lock "refresh_cache-b27ace75-e2fa-4acc-96cb-88dd49b89de5" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 892.374307] env[68617]: DEBUG nova.network.neutron [req-167b6770-81b9-4940-958c-1ab9cf60d800 req-8a7ed568-2877-4c0b-922c-1cc877b1a232 service nova] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Refreshing network info cache for port 03cc80bd-ed65-4c79-ad95-90b2b50a3bf3 {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 892.709541] env[68617]: DEBUG nova.network.neutron [req-167b6770-81b9-4940-958c-1ab9cf60d800 req-8a7ed568-2877-4c0b-922c-1cc877b1a232 service nova] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Updated VIF entry in instance network info cache for port 03cc80bd-ed65-4c79-ad95-90b2b50a3bf3. {{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 892.709541] env[68617]: DEBUG nova.network.neutron [req-167b6770-81b9-4940-958c-1ab9cf60d800 req-8a7ed568-2877-4c0b-922c-1cc877b1a232 service nova] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Updating instance_info_cache with network_info: [{"id": "03cc80bd-ed65-4c79-ad95-90b2b50a3bf3", "address": "fa:16:3e:3a:40:69", "network": {"id": "0f2e6893-43e2-458a-8326-dd03f1a6b1a7", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2034507765-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "828be21ced7d4d11a462ae49d04280ba", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "49b5df12-d801-4140-8816-2fd401608c7d", "external-id": "nsx-vlan-transportzone-326", "segmentation_id": 326, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap03cc80bd-ed", "ovs_interfaceid": "03cc80bd-ed65-4c79-ad95-90b2b50a3bf3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 892.720678] env[68617]: DEBUG oslo_concurrency.lockutils [req-167b6770-81b9-4940-958c-1ab9cf60d800 req-8a7ed568-2877-4c0b-922c-1cc877b1a232 service nova] Releasing lock "refresh_cache-b27ace75-e2fa-4acc-96cb-88dd49b89de5" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 892.871032] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470744, 'name': CreateVM_Task, 'duration_secs': 0.316355} completed successfully. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 892.871032] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 892.871032] env[68617]: DEBUG oslo_concurrency.lockutils [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 892.871032] env[68617]: DEBUG oslo_concurrency.lockutils [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 892.871032] env[68617]: DEBUG oslo_concurrency.lockutils [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 892.871279] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f334aa71-6ac5-4017-af05-66302655e6ae {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 892.877660] env[68617]: DEBUG oslo_vmware.api [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Waiting for the task: (returnval){ [ 892.877660] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52ea2642-58d8-4f1c-9710-484a50062915" [ 892.877660] env[68617]: _type = "Task" [ 892.877660] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 892.885304] env[68617]: DEBUG oslo_vmware.api [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52ea2642-58d8-4f1c-9710-484a50062915, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 893.389947] env[68617]: DEBUG oslo_concurrency.lockutils [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 893.390263] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 893.390441] env[68617]: DEBUG oslo_concurrency.lockutils [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 894.458225] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f9d6cc47-c1e8-4702-bb86-2d369ab71d22 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Acquiring lock "b95883b2-0366-4f52-bdf2-aa6259fafc58" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 900.724034] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Acquiring lock "79c92a1b-20ef-4360-93b4-913cbfcf92fe" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 900.724365] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Lock "79c92a1b-20ef-4360-93b4-913cbfcf92fe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 901.060981] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9263fd01-4686-49e9-a410-a88e49136d17 tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Acquiring lock "40de8cd1-1c46-4ffb-866b-255386fe44b6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 901.061302] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9263fd01-4686-49e9-a410-a88e49136d17 tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Lock "40de8cd1-1c46-4ffb-866b-255386fe44b6" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 901.822813] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f7e12e17-4e6f-4de7-8a04-8e13af42ffe6 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Acquiring lock "050e2b27-1311-4a9a-b5cf-6bc2f7128eba" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 903.004501] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3688968f-917b-4fd9-807f-5df2c4a88464 tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Acquiring lock "3b95678b-dfc5-4610-a51e-2ae12fbe274b" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 904.363286] env[68617]: DEBUG oslo_concurrency.lockutils [None req-518992bc-62fb-4e9a-b170-108b82689824 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Acquiring lock "6300077d-5aa7-4794-8ba2-1ec30151c15c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 908.328560] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6bec6ec9-d9dc-4a85-a09e-ef0a836883fb tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "6eef6e24-cf49-458b-ae37-8da4e02045f8" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 909.832656] env[68617]: DEBUG oslo_concurrency.lockutils [None req-02678a3c-83cc-4b14-9b23-d90f9e93cbd7 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Acquiring lock "e6b6cbdd-11d6-44a6-8da7-98e0f52cef67" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 910.588666] env[68617]: DEBUG oslo_concurrency.lockutils [None req-22536537-b16f-43a6-9dc4-cf2c2d6e35ed tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Acquiring lock "71b1ebba-2019-4378-9bd2-98a7559c22e8" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 910.866401] env[68617]: DEBUG oslo_concurrency.lockutils [None req-510bb851-f7ec-43a9-98d3-f0340658b54a tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquiring lock "b27ace75-e2fa-4acc-96cb-88dd49b89de5" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 913.940132] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f5125ede-cdae-41f7-b164-3802b3036641 tempest-VolumesAdminNegativeTest-561724217 
tempest-VolumesAdminNegativeTest-561724217-project-member] Acquiring lock "1cc76382-5452-4ed4-bb99-c6800c70d42a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 913.940433] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f5125ede-cdae-41f7-b164-3802b3036641 tempest-VolumesAdminNegativeTest-561724217 tempest-VolumesAdminNegativeTest-561724217-project-member] Lock "1cc76382-5452-4ed4-bb99-c6800c70d42a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 923.951610] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d6c7d1a5-b7ee-48d5-b286-c143104f8926 tempest-ServersTestMultiNic-884689889 tempest-ServersTestMultiNic-884689889-project-member] Acquiring lock "3258d1a5-7142-4e06-814d-e68fd90262ae" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 923.951913] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d6c7d1a5-b7ee-48d5-b286-c143104f8926 tempest-ServersTestMultiNic-884689889 tempest-ServersTestMultiNic-884689889-project-member] Lock "3258d1a5-7142-4e06-814d-e68fd90262ae" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 933.074765] env[68617]: DEBUG oslo_concurrency.lockutils [None req-df11a94d-2e23-4a9d-904f-e8df2d8982ce tempest-MigrationsAdminTest-1112293401 tempest-MigrationsAdminTest-1112293401-project-member] Acquiring lock "e3a2fb7d-b092-485f-b64a-486c458ba845" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 933.075538] env[68617]: DEBUG oslo_concurrency.lockutils [None req-df11a94d-2e23-4a9d-904f-e8df2d8982ce tempest-MigrationsAdminTest-1112293401 tempest-MigrationsAdminTest-1112293401-project-member] Lock "e3a2fb7d-b092-485f-b64a-486c458ba845" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 934.584101] env[68617]: DEBUG oslo_concurrency.lockutils [None req-b221c568-8686-45ed-a2d9-100ed1519d21 tempest-SecurityGroupsTestJSON-1069621129 tempest-SecurityGroupsTestJSON-1069621129-project-member] Acquiring lock "eaeae56d-8e71-43bc-8441-49a29c161763" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 934.584101] env[68617]: DEBUG oslo_concurrency.lockutils [None req-b221c568-8686-45ed-a2d9-100ed1519d21 tempest-SecurityGroupsTestJSON-1069621129 tempest-SecurityGroupsTestJSON-1069621129-project-member] Lock "eaeae56d-8e71-43bc-8441-49a29c161763" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 937.674171] env[68617]: WARNING oslo_vmware.rw_handles [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 937.674171] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 937.674171] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 937.674171] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 937.674171] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 937.674171] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 937.674171] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 937.674171] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 937.674171] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 937.674171] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 937.674171] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 937.674171] env[68617]: ERROR oslo_vmware.rw_handles [ 937.674805] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/ca7b2a39-463c-4c8f-83e9-5edadd25ffdf/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 937.676173] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 937.676452] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Copying Virtual Disk [datastore2] vmware_temp/ca7b2a39-463c-4c8f-83e9-5edadd25ffdf/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/ca7b2a39-463c-4c8f-83e9-5edadd25ffdf/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 937.676758] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-8b3c5678-92fa-4cfb-a3d3-a326bd7d7e18 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 937.684917] env[68617]: DEBUG oslo_vmware.api [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Waiting for the task: (returnval){ 
[ 937.684917] env[68617]: value = "task-3470745" [ 937.684917] env[68617]: _type = "Task" [ 937.684917] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 937.692765] env[68617]: DEBUG oslo_vmware.api [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Task: {'id': task-3470745, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 938.195683] env[68617]: DEBUG oslo_vmware.exceptions [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Fault InvalidArgument not matched. {{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 938.195993] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 938.196606] env[68617]: ERROR nova.compute.manager [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 938.196606] env[68617]: Faults: ['InvalidArgument'] [ 938.196606] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Traceback (most recent call last): [ 938.196606] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 938.196606] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] yield resources [ 938.196606] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 938.196606] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] self.driver.spawn(context, instance, image_meta, [ 938.196606] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 938.196606] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] self._vmops.spawn(context, instance, image_meta, injected_files, [ 938.196606] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 938.196606] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] self._fetch_image_if_missing(context, vi) [ 938.196606] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 938.197045] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] 
image_cache(vi, tmp_image_ds_loc) [ 938.197045] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 938.197045] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] vm_util.copy_virtual_disk( [ 938.197045] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 938.197045] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] session._wait_for_task(vmdk_copy_task) [ 938.197045] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 938.197045] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] return self.wait_for_task(task_ref) [ 938.197045] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 938.197045] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] return evt.wait() [ 938.197045] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 938.197045] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] result = hub.switch() [ 938.197045] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 938.197045] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] return self.greenlet.switch() [ 938.197523] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 938.197523] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] self.f(*self.args, **self.kw) [ 938.197523] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 938.197523] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] raise exceptions.translate_fault(task_info.error) [ 938.197523] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 938.197523] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Faults: ['InvalidArgument'] [ 938.197523] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] [ 938.197523] env[68617]: INFO nova.compute.manager [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Terminating instance [ 938.198633] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 
tempest-ServerDiagnosticsV248Test-305417211-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 938.198745] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 938.199335] env[68617]: DEBUG nova.compute.manager [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 938.199524] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 938.199744] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-685a9b95-193b-4283-afa7-e0db80e34445 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 938.201933] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5500421-700a-40f1-af4f-6382f5e27786 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 938.208822] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 938.209036] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-8cbff1eb-d822-4160-92d7-3231f29c02a1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 938.211174] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 938.211344] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 938.212271] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a11e1456-0204-4526-9187-ddfc57ccefdc {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 938.216901] env[68617]: DEBUG oslo_vmware.api [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Waiting for the task: (returnval){ [ 938.216901] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52c5fb90-ad37-b5b0-3af3-0101e23e551c" [ 938.216901] env[68617]: _type = "Task" [ 938.216901] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 938.224824] env[68617]: DEBUG oslo_vmware.api [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52c5fb90-ad37-b5b0-3af3-0101e23e551c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 938.726759] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 938.727034] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Creating directory with path [datastore2] vmware_temp/2ef72850-9e1a-4f26-954d-f297e01eeb9f/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 938.727290] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f5c22f75-3443-49f7-8655-eed536a1ba04 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 938.745146] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Created directory with path [datastore2] vmware_temp/2ef72850-9e1a-4f26-954d-f297e01eeb9f/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 938.745338] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Fetch image to [datastore2] vmware_temp/2ef72850-9e1a-4f26-954d-f297e01eeb9f/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 938.745507] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Downloading image 
file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/2ef72850-9e1a-4f26-954d-f297e01eeb9f/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 938.746320] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2011682-cb4e-4bc3-9ff5-d553d4ec8de3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 938.754199] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29cd9690-485e-4a8d-8e27-53da37cebec1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 938.762962] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32095cb6-7933-4a66-a271-770fbcc5229b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 938.793955] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71499354-8929-4a59-8da1-da91e9a7bda1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 938.799376] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7639f9e6-1500-4df6-ba3a-18034f999315 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 938.829513] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 938.875370] env[68617]: DEBUG oslo_vmware.rw_handles [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2ef72850-9e1a-4f26-954d-f297e01eeb9f/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 938.933942] env[68617]: DEBUG oslo_vmware.rw_handles [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 938.933942] env[68617]: DEBUG oslo_vmware.rw_handles [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2ef72850-9e1a-4f26-954d-f297e01eeb9f/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 939.854783] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 939.854783] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 939.854783] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Deleting the datastore file [datastore2] b95883b2-0366-4f52-bdf2-aa6259fafc58 {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 939.855384] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1fd72389-14f7-47bf-917d-c6107dfcee29 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 939.865152] env[68617]: DEBUG oslo_vmware.api [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Waiting for the task: (returnval){ [ 939.865152] env[68617]: value = "task-3470747" [ 939.865152] env[68617]: _type = "Task" [ 939.865152] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 939.872946] env[68617]: DEBUG oslo_vmware.api [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Task: {'id': task-3470747, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 940.329776] env[68617]: DEBUG oslo_concurrency.lockutils [None req-37833af3-316f-48e8-98d4-ffffff3a5894 tempest-ServerRescueNegativeTestJSON-1564947093 tempest-ServerRescueNegativeTestJSON-1564947093-project-member] Acquiring lock "79d8a532-b071-4c79-8c5d-f08438928201" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 940.330424] env[68617]: DEBUG oslo_concurrency.lockutils [None req-37833af3-316f-48e8-98d4-ffffff3a5894 tempest-ServerRescueNegativeTestJSON-1564947093 tempest-ServerRescueNegativeTestJSON-1564947093-project-member] Lock "79d8a532-b071-4c79-8c5d-f08438928201" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 940.376705] env[68617]: DEBUG oslo_vmware.api [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Task: {'id': task-3470747, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.197089} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 940.377045] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 940.378118] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 940.378425] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 940.378614] env[68617]: INFO nova.compute.manager [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Took 2.18 seconds to destroy the instance on the hypervisor. 
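
The CopyVirtualDisk_Task and DeleteDatastoreFile_Task entries above follow the same wait_for_task/_poll_task cycle: submit the task, poll its state ("progress is 0%."), then either return on success ("completed successfully", with duration_secs) or translate the server-side fault, which is what the InvalidArgument traceback shows. A minimal sketch of that loop; fetch_task_info and TaskFailed are illustrative stand-ins, not the oslo.vmware API:

```python
# Minimal sketch of the wait_for_task/_poll_task cycle traced above.
# fetch_task_info() and TaskFailed are illustrative stand-ins, not the
# oslo.vmware API; the real loop runs on eventlet and translates vCenter
# faults (e.g. InvalidArgument) into VimFaultException subclasses.
import time


class TaskFailed(Exception):
    """The vCenter task ended in an error state."""


def wait_for_task(task_ref, fetch_task_info, poll_interval=0.5):
    """Poll a vCenter task reference until it succeeds or fails."""
    while True:
        info = fetch_task_info(task_ref)     # e.g. a PropertyCollector read
        if info["state"] == "success":
            return info                      # "completed successfully"
        if info["state"] == "error":
            raise TaskFailed(info["error"])  # becomes the logged fault
        # Interim states produce the "progress is 0%." entries above.
        time.sleep(poll_interval)
```
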
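The rw_handles entries above (a write connection with size = 21318656, a /folder URL carrying dcPath and dsName query parameters, and an AcquireGenericServiceTicket call just before) trace a plain HTTP PUT of the sparse VMDK into the datastore. A sketch of that pattern under stated assumptions: upload_vmdk is our name, requests stands in for the real handle, and attaching the ticket as a cookie is purely illustrative:

```python
# Sketch of the datastore upload the rw_handles entries trace: the image
# bytes are streamed over HTTP PUT to the ESX host's /folder endpoint,
# addressed by dcPath and dsName query parameters. upload_vmdk() and the
# cookie-based ticket handling are assumptions for illustration, not the
# oslo.vmware implementation.
import requests


def upload_vmdk(host, ds_rel_path, dc_path, ds_name, stream, size, ticket):
    url = f"https://{host}:443/folder/{ds_rel_path}"
    resp = requests.put(
        url,
        params={"dcPath": dc_path, "dsName": ds_name},
        data=stream,                      # binary file-like object
        headers={
            "Content-Length": str(size),  # log: "size = 21318656"
            "Cookie": ticket,             # from AcquireGenericServiceTicket
        },
        verify=False,  # matches this lab setup; verify certs in production
    )
    resp.raise_for_status()
```
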
[ 940.382607] env[68617]: DEBUG nova.compute.claims [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 940.382689] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 940.382921] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 940.699251] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 940.699407] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Cleaning up deleted instances {{(pid=68617) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 940.717433] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] There are 0 instances to clean {{(pid=68617) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 940.717977] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 940.718142] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Cleaning up deleted instances with incomplete migration {{(pid=68617) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 940.727178] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 940.914086] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e12fc01-c367-4f80-a455-e9a63c863cc3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 940.925179] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bf64e2e-a4b2-4c11-898a-cb512b44ccc2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 940.956786] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a739009a-4c18-4bae-8f4f-706ccab7a7ad {{(pid=68617) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 940.964226] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7da42121-e50a-40f9-b837-fe012e2bd45b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 940.978708] env[68617]: DEBUG nova.compute.provider_tree [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 940.992180] env[68617]: DEBUG nova.scheduler.client.report [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 941.011505] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.627s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 941.011505] env[68617]: ERROR nova.compute.manager [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 941.011505] env[68617]: Faults: ['InvalidArgument'] [ 941.011505] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Traceback (most recent call last): [ 941.011505] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 941.011505] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] self.driver.spawn(context, instance, image_meta, [ 941.011505] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 941.011505] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] self._vmops.spawn(context, instance, image_meta, injected_files, [ 941.011505] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 941.011505] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] self._fetch_image_if_missing(context, vi) [ 941.011772] env[68617]: ERROR nova.compute.manager [instance: 
b95883b2-0366-4f52-bdf2-aa6259fafc58] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 941.011772] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] image_cache(vi, tmp_image_ds_loc) [ 941.011772] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 941.011772] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] vm_util.copy_virtual_disk( [ 941.011772] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 941.011772] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] session._wait_for_task(vmdk_copy_task) [ 941.011772] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 941.011772] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] return self.wait_for_task(task_ref) [ 941.011772] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 941.011772] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] return evt.wait() [ 941.011772] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 941.011772] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] result = hub.switch() [ 941.011772] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 941.012103] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] return self.greenlet.switch() [ 941.012103] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 941.012103] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] self.f(*self.args, **self.kw) [ 941.012103] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 941.012103] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] raise exceptions.translate_fault(task_info.error) [ 941.012103] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 941.012103] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Faults: ['InvalidArgument'] [ 941.012103] env[68617]: ERROR nova.compute.manager [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] [ 941.012103] env[68617]: DEBUG nova.compute.utils [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: 
b95883b2-0366-4f52-bdf2-aa6259fafc58] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 941.016017] env[68617]: DEBUG nova.compute.manager [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Build of instance b95883b2-0366-4f52-bdf2-aa6259fafc58 was re-scheduled: A specified parameter was not correct: fileType [ 941.016017] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 941.016017] env[68617]: DEBUG nova.compute.manager [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 941.016017] env[68617]: DEBUG nova.compute.manager [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 941.016017] env[68617]: DEBUG nova.compute.manager [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 941.016276] env[68617]: DEBUG nova.network.neutron [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 941.380685] env[68617]: DEBUG nova.network.neutron [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 941.394862] env[68617]: INFO nova.compute.manager [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Took 0.38 seconds to deallocate network for instance. 
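
The "Inventory has not changed" blob reported above carries everything Placement needs to size this node: per resource class, usable capacity is (total - reserved) * allocation_ratio. A short worked check against the logged values; effective_capacity is our helper, not a Nova or Placement API:

```python
# Usable capacity per resource class, derived from the inventory logged
# above the way Placement computes it: (total - reserved) * allocation_ratio.
# effective_capacity() is an illustrative helper, not a Nova/Placement API.
inventory = {
    "VCPU":      {"total": 48,     "reserved": 0,   "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB":   {"total": 400,    "reserved": 0,   "allocation_ratio": 1.0},
}


def effective_capacity(inv):
    return {rc: (f["total"] - f["reserved"]) * f["allocation_ratio"]
            for rc, f in inv.items()}


print(effective_capacity(inventory))
# {'VCPU': 192.0, 'MEMORY_MB': 196078.0, 'DISK_GB': 400.0}
```
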
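The "Acquiring lock … / acquired … waited Ns / released … held Ns" triplets threaded through these entries come from oslo.concurrency's lock wrapper (the inner frames at lockutils.py:404, 409 and 423). The same bookkeeping can be reproduced with the real lockutils.synchronized decorator; the guarded function below is ours, chosen to mirror the compute_resources entries:

```python
# The acquire/waited/held bookkeeping lines in this log are emitted by
# oslo.concurrency's lock decorator. lockutils.synchronized() is the real
# API; the lock name and function body here are illustrative only.
from oslo_concurrency import lockutils


@lockutils.synchronized("compute_resources")
def abort_instance_claim(instance_uuid):
    # While this body runs, concurrent callers log 'Acquiring lock
    # "compute_resources"' and block; on return, the held-time
    # (e.g. "held 0.627s" above) is logged at DEBUG.
    pass
```
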
[ 941.528221] env[68617]: INFO nova.scheduler.client.report [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Deleted allocations for instance b95883b2-0366-4f52-bdf2-aa6259fafc58 [ 941.558823] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9a4725be-5a60-45be-85f0-e82f9eb2dc99 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Lock "b95883b2-0366-4f52-bdf2-aa6259fafc58" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 246.551s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 941.560047] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f9d6cc47-c1e8-4702-bb86-2d369ab71d22 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Lock "b95883b2-0366-4f52-bdf2-aa6259fafc58" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 47.102s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 941.560243] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f9d6cc47-c1e8-4702-bb86-2d369ab71d22 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Acquiring lock "b95883b2-0366-4f52-bdf2-aa6259fafc58-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 941.560457] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f9d6cc47-c1e8-4702-bb86-2d369ab71d22 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Lock "b95883b2-0366-4f52-bdf2-aa6259fafc58-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 941.560681] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f9d6cc47-c1e8-4702-bb86-2d369ab71d22 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Lock "b95883b2-0366-4f52-bdf2-aa6259fafc58-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 941.564679] env[68617]: INFO nova.compute.manager [None req-f9d6cc47-c1e8-4702-bb86-2d369ab71d22 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Terminating instance [ 941.567355] env[68617]: DEBUG nova.compute.manager [None req-f9d6cc47-c1e8-4702-bb86-2d369ab71d22 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Start destroying the instance on the hypervisor. 
{{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 941.567355] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f9d6cc47-c1e8-4702-bb86-2d369ab71d22 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 941.567789] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-22c86cdf-9669-4480-b1b6-35089878673f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 941.580157] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-352ff7ea-1753-4279-a58f-ea9a20f8ce12 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 941.592609] env[68617]: DEBUG nova.compute.manager [None req-8673ebc2-02bf-4b52-87c4-05d73ef56ad6 tempest-VolumesAdminNegativeTest-561724217 tempest-VolumesAdminNegativeTest-561724217-project-member] [instance: 152f9e1d-dd1b-486f-94b8-8202c0f2d335] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 941.614270] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-f9d6cc47-c1e8-4702-bb86-2d369ab71d22 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b95883b2-0366-4f52-bdf2-aa6259fafc58 could not be found. [ 941.614488] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f9d6cc47-c1e8-4702-bb86-2d369ab71d22 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 941.614667] env[68617]: INFO nova.compute.manager [None req-f9d6cc47-c1e8-4702-bb86-2d369ab71d22 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Took 0.05 seconds to destroy the instance on the hypervisor. [ 941.614908] env[68617]: DEBUG oslo.service.loopingcall [None req-f9d6cc47-c1e8-4702-bb86-2d369ab71d22 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 941.615275] env[68617]: DEBUG nova.compute.manager [-] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 941.615373] env[68617]: DEBUG nova.network.neutron [-] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 941.632604] env[68617]: DEBUG nova.compute.manager [None req-8673ebc2-02bf-4b52-87c4-05d73ef56ad6 tempest-VolumesAdminNegativeTest-561724217 tempest-VolumesAdminNegativeTest-561724217-project-member] [instance: 152f9e1d-dd1b-486f-94b8-8202c0f2d335] Instance disappeared before build. 
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 941.644452] env[68617]: DEBUG nova.network.neutron [-] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 941.654339] env[68617]: INFO nova.compute.manager [-] [instance: b95883b2-0366-4f52-bdf2-aa6259fafc58] Took 0.04 seconds to deallocate network for instance. [ 941.669693] env[68617]: DEBUG oslo_concurrency.lockutils [None req-8673ebc2-02bf-4b52-87c4-05d73ef56ad6 tempest-VolumesAdminNegativeTest-561724217 tempest-VolumesAdminNegativeTest-561724217-project-member] Lock "152f9e1d-dd1b-486f-94b8-8202c0f2d335" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 227.637s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 941.689359] env[68617]: DEBUG nova.compute.manager [None req-cc7683d3-a14d-40a7-9a16-b134b3aec2f0 tempest-ServerShowV254Test-70051388 tempest-ServerShowV254Test-70051388-project-member] [instance: 9d10a63c-4c97-48c3-aca8-fd317aa2fbe7] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 941.740547] env[68617]: DEBUG nova.compute.manager [None req-cc7683d3-a14d-40a7-9a16-b134b3aec2f0 tempest-ServerShowV254Test-70051388 tempest-ServerShowV254Test-70051388-project-member] [instance: 9d10a63c-4c97-48c3-aca8-fd317aa2fbe7] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 941.741313] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 941.742091] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 941.742091] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 941.771455] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 941.771747] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 941.771994] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Skipping network cache update for instance because it is Building. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 941.772284] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 941.772518] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 941.772775] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 941.772974] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 941.773516] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 941.773516] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 941.773654] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 941.775234] env[68617]: DEBUG oslo_concurrency.lockutils [None req-cc7683d3-a14d-40a7-9a16-b134b3aec2f0 tempest-ServerShowV254Test-70051388 tempest-ServerShowV254Test-70051388-project-member] Lock "9d10a63c-4c97-48c3-aca8-fd317aa2fbe7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 226.817s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 941.791920] env[68617]: DEBUG nova.compute.manager [None req-5f4b9570-f02c-433a-af31-90737f90adf7 tempest-ImagesNegativeTestJSON-780312143 tempest-ImagesNegativeTestJSON-780312143-project-member] [instance: e6e6c910-9485-48b0-bffa-4534cd7f87d4] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 941.834880] env[68617]: DEBUG nova.compute.manager [None req-5f4b9570-f02c-433a-af31-90737f90adf7 tempest-ImagesNegativeTestJSON-780312143 tempest-ImagesNegativeTestJSON-780312143-project-member] [instance: e6e6c910-9485-48b0-bffa-4534cd7f87d4] Instance disappeared before build. 
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 941.844406] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f9d6cc47-c1e8-4702-bb86-2d369ab71d22 tempest-ServerPasswordTestJSON-1098303230 tempest-ServerPasswordTestJSON-1098303230-project-member] Lock "b95883b2-0366-4f52-bdf2-aa6259fafc58" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.284s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 941.871585] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5f4b9570-f02c-433a-af31-90737f90adf7 tempest-ImagesNegativeTestJSON-780312143 tempest-ImagesNegativeTestJSON-780312143-project-member] Lock "e6e6c910-9485-48b0-bffa-4534cd7f87d4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 220.291s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 941.888904] env[68617]: DEBUG nova.compute.manager [None req-87039a0c-e740-428e-b486-a4fc387bd6d0 tempest-ServersTestMultiNic-884689889 tempest-ServersTestMultiNic-884689889-project-member] [instance: 1ec954d1-1bc9-4db3-9a48-7da759cebf21] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 941.932040] env[68617]: DEBUG nova.compute.manager [None req-87039a0c-e740-428e-b486-a4fc387bd6d0 tempest-ServersTestMultiNic-884689889 tempest-ServersTestMultiNic-884689889-project-member] [instance: 1ec954d1-1bc9-4db3-9a48-7da759cebf21] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 941.999011] env[68617]: DEBUG oslo_concurrency.lockutils [None req-87039a0c-e740-428e-b486-a4fc387bd6d0 tempest-ServersTestMultiNic-884689889 tempest-ServersTestMultiNic-884689889-project-member] Lock "1ec954d1-1bc9-4db3-9a48-7da759cebf21" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 219.124s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 942.014197] env[68617]: DEBUG nova.compute.manager [None req-ac06e194-790a-4b22-a986-6f23dcf296da tempest-VolumesAssistedSnapshotsTest-1080420425 tempest-VolumesAssistedSnapshotsTest-1080420425-project-member] [instance: 7d51d3c0-12fd-4118-80c6-16c1cca346db] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 942.058283] env[68617]: DEBUG nova.compute.manager [None req-ac06e194-790a-4b22-a986-6f23dcf296da tempest-VolumesAssistedSnapshotsTest-1080420425 tempest-VolumesAssistedSnapshotsTest-1080420425-project-member] [instance: 7d51d3c0-12fd-4118-80c6-16c1cca346db] Instance disappeared before build. 
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 942.085834] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ac06e194-790a-4b22-a986-6f23dcf296da tempest-VolumesAssistedSnapshotsTest-1080420425 tempest-VolumesAssistedSnapshotsTest-1080420425-project-member] Lock "7d51d3c0-12fd-4118-80c6-16c1cca346db" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 218.670s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 942.110192] env[68617]: DEBUG oslo_concurrency.lockutils [None req-8fbf2960-ea12-4872-8e64-f7390800461a tempest-ServerRescueNegativeTestJSON-1564947093 tempest-ServerRescueNegativeTestJSON-1564947093-project-member] Acquiring lock "c9e6a9e1-6479-47ba-ae12-0441d2761bb6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 942.110397] env[68617]: DEBUG oslo_concurrency.lockutils [None req-8fbf2960-ea12-4872-8e64-f7390800461a tempest-ServerRescueNegativeTestJSON-1564947093 tempest-ServerRescueNegativeTestJSON-1564947093-project-member] Lock "c9e6a9e1-6479-47ba-ae12-0441d2761bb6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 942.110661] env[68617]: DEBUG nova.compute.manager [None req-5d634d33-51a2-4f85-8593-a8501573f884 tempest-SecurityGroupsTestJSON-1069621129 tempest-SecurityGroupsTestJSON-1069621129-project-member] [instance: a43cf82a-c969-47eb-b8dc-d7fe7f7870d3] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 942.143380] env[68617]: DEBUG nova.compute.manager [None req-5d634d33-51a2-4f85-8593-a8501573f884 tempest-SecurityGroupsTestJSON-1069621129 tempest-SecurityGroupsTestJSON-1069621129-project-member] [instance: a43cf82a-c969-47eb-b8dc-d7fe7f7870d3] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 942.174356] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5d634d33-51a2-4f85-8593-a8501573f884 tempest-SecurityGroupsTestJSON-1069621129 tempest-SecurityGroupsTestJSON-1069621129-project-member] Lock "a43cf82a-c969-47eb-b8dc-d7fe7f7870d3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 216.584s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 942.188633] env[68617]: DEBUG nova.compute.manager [None req-94dbeb25-06a4-414d-8622-f98526281db2 tempest-ServersWithSpecificFlavorTestJSON-843794936 tempest-ServersWithSpecificFlavorTestJSON-843794936-project-member] [instance: 40c6521b-51d9-45cf-959c-21e4f3da7eb9] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 942.217908] env[68617]: DEBUG nova.compute.manager [None req-94dbeb25-06a4-414d-8622-f98526281db2 tempest-ServersWithSpecificFlavorTestJSON-843794936 tempest-ServersWithSpecificFlavorTestJSON-843794936-project-member] [instance: 40c6521b-51d9-45cf-959c-21e4f3da7eb9] Instance disappeared before build. 
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 942.247496] env[68617]: DEBUG oslo_concurrency.lockutils [None req-94dbeb25-06a4-414d-8622-f98526281db2 tempest-ServersWithSpecificFlavorTestJSON-843794936 tempest-ServersWithSpecificFlavorTestJSON-843794936-project-member] Lock "40c6521b-51d9-45cf-959c-21e4f3da7eb9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 213.628s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 942.252601] env[68617]: DEBUG oslo_concurrency.lockutils [None req-b69cc1c6-c860-42b7-9835-b29069b7969d tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Acquiring lock "96bc8135-1233-4569-99ce-c7a529b96d11" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 942.252876] env[68617]: DEBUG oslo_concurrency.lockutils [None req-b69cc1c6-c860-42b7-9835-b29069b7969d tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Lock "96bc8135-1233-4569-99ce-c7a529b96d11" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 942.257394] env[68617]: DEBUG nova.compute.manager [None req-ec57737b-ea16-4b83-9bb4-b9d43e9aef52 tempest-MigrationsAdminTest-1112293401 tempest-MigrationsAdminTest-1112293401-project-member] [instance: dae068af-0c54-4715-bdc3-ecfd018b6294] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 942.286764] env[68617]: DEBUG nova.compute.manager [None req-ec57737b-ea16-4b83-9bb4-b9d43e9aef52 tempest-MigrationsAdminTest-1112293401 tempest-MigrationsAdminTest-1112293401-project-member] [instance: dae068af-0c54-4715-bdc3-ecfd018b6294] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 942.307891] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ec57737b-ea16-4b83-9bb4-b9d43e9aef52 tempest-MigrationsAdminTest-1112293401 tempest-MigrationsAdminTest-1112293401-project-member] Lock "dae068af-0c54-4715-bdc3-ecfd018b6294" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 211.310s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 942.319279] env[68617]: DEBUG nova.compute.manager [None req-ae66108a-45b4-4490-93f1-08beff341940 tempest-InstanceActionsNegativeTestJSON-1283463967 tempest-InstanceActionsNegativeTestJSON-1283463967-project-member] [instance: ee6e18cd-9af2-4440-8336-9e1858c28709] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 942.343306] env[68617]: DEBUG nova.compute.manager [None req-ae66108a-45b4-4490-93f1-08beff341940 tempest-InstanceActionsNegativeTestJSON-1283463967 tempest-InstanceActionsNegativeTestJSON-1283463967-project-member] [instance: ee6e18cd-9af2-4440-8336-9e1858c28709] Instance disappeared before build. 
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 942.366639] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ae66108a-45b4-4490-93f1-08beff341940 tempest-InstanceActionsNegativeTestJSON-1283463967 tempest-InstanceActionsNegativeTestJSON-1283463967-project-member] Lock "ee6e18cd-9af2-4440-8336-9e1858c28709" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 210.921s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 942.384158] env[68617]: DEBUG nova.compute.manager [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 942.438675] env[68617]: DEBUG oslo_concurrency.lockutils [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 942.439041] env[68617]: DEBUG oslo_concurrency.lockutils [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 942.440736] env[68617]: INFO nova.compute.claims [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 942.984017] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a27b8da-97ab-4edf-a734-c698a31097f8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 942.989292] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-529bacb2-681d-45f1-b79d-524b021bf1c1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 943.022554] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe2a19d4-a310-41b1-8776-83ddc24264ab {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 943.033757] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f65bfb20-c036-494e-b765-23193c090865 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 943.048025] env[68617]: DEBUG nova.compute.provider_tree [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory 
/opt/stack/nova/nova/compute/provider_tree.py:180}} [ 943.060347] env[68617]: DEBUG nova.scheduler.client.report [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 943.081476] env[68617]: DEBUG oslo_concurrency.lockutils [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.642s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 943.081873] env[68617]: DEBUG nova.compute.manager [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 943.128988] env[68617]: DEBUG nova.compute.utils [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 943.130322] env[68617]: DEBUG nova.compute.manager [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 943.130500] env[68617]: DEBUG nova.network.neutron [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 943.142068] env[68617]: DEBUG nova.compute.manager [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Start building block device mappings for instance. 
{{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 943.211506] env[68617]: DEBUG nova.policy [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '11eecc8f059e410cb97bafaadc378f89', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4de7b27e9cf04c16b8dee80e756404fd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 943.224240] env[68617]: DEBUG nova.compute.manager [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Start spawning the instance on the hypervisor. {{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 943.254514] env[68617]: DEBUG nova.virt.hardware [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 943.254784] env[68617]: DEBUG nova.virt.hardware [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 943.254892] env[68617]: DEBUG nova.virt.hardware [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 943.255078] env[68617]: DEBUG nova.virt.hardware [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 943.255490] env[68617]: DEBUG nova.virt.hardware [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 943.255490] env[68617]: DEBUG nova.virt.hardware [None 
req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 943.255589] env[68617]: DEBUG nova.virt.hardware [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 943.256055] env[68617]: DEBUG nova.virt.hardware [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 943.256055] env[68617]: DEBUG nova.virt.hardware [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 943.256055] env[68617]: DEBUG nova.virt.hardware [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 943.256215] env[68617]: DEBUG nova.virt.hardware [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 943.257119] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e0d8a54-713e-4ad6-9ea3-769adf958694 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 943.267473] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a188221-651b-46c8-8346-0ce1e500b992 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 943.562842] env[68617]: DEBUG nova.network.neutron [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Successfully created port: 55278f04-c477-4777-b662-6afbaf96a16e {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 944.699627] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 944.699968] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 944.700067] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 944.700223] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 945.698590] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 945.698826] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 945.698972] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 946.513746] env[68617]: DEBUG nova.network.neutron [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Successfully updated port: 55278f04-c477-4777-b662-6afbaf96a16e {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 946.513954] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Acquiring lock "f560b4df-fb57-4f7b-8a8b-53325970e06e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 946.514218] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Lock "f560b4df-fb57-4f7b-8a8b-53325970e06e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 946.530543] env[68617]: DEBUG oslo_concurrency.lockutils [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "refresh_cache-995585f5-57a4-4ba6-9e28-18a086af264c" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 946.530699] env[68617]: DEBUG oslo_concurrency.lockutils [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquired lock "refresh_cache-995585f5-57a4-4ba6-9e28-18a086af264c" {{(pid=68617) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 946.531089] env[68617]: DEBUG nova.network.neutron [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 946.576237] env[68617]: DEBUG nova.network.neutron [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 946.698790] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 946.713339] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 946.713557] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 946.713715] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 946.713866] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 946.715306] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a1c4ea5-7019-4bd8-bd39-2476404efea0 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 946.726388] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58029f5d-5713-4048-987b-753d2454e1f6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 946.746534] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9a437d4-25b9-4f72-a698-2ac1bcd91416 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 946.754986] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e09f5503-03a9-4412-b59b-44e803bd9141 {{(pid=68617) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 946.789027] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180925MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 946.789205] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 946.789414] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 946.837399] env[68617]: DEBUG oslo_concurrency.lockutils [None req-07b182b3-189d-4235-affd-8500e6424c2c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "995585f5-57a4-4ba6-9e28-18a086af264c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 946.847049] env[68617]: DEBUG nova.compute.manager [req-49fde391-f613-46c1-b8e6-9caf192aa3cb req-c386a8c5-16ce-488b-bf5c-3967fa069712 service nova] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Received event network-vif-plugged-55278f04-c477-4777-b662-6afbaf96a16e {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 946.847049] env[68617]: DEBUG oslo_concurrency.lockutils [req-49fde391-f613-46c1-b8e6-9caf192aa3cb req-c386a8c5-16ce-488b-bf5c-3967fa069712 service nova] Acquiring lock "995585f5-57a4-4ba6-9e28-18a086af264c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 946.847049] env[68617]: DEBUG oslo_concurrency.lockutils [req-49fde391-f613-46c1-b8e6-9caf192aa3cb req-c386a8c5-16ce-488b-bf5c-3967fa069712 service nova] Lock "995585f5-57a4-4ba6-9e28-18a086af264c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 946.847049] env[68617]: DEBUG oslo_concurrency.lockutils [req-49fde391-f613-46c1-b8e6-9caf192aa3cb req-c386a8c5-16ce-488b-bf5c-3967fa069712 service nova] Lock "995585f5-57a4-4ba6-9e28-18a086af264c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 946.847221] env[68617]: DEBUG nova.compute.manager [req-49fde391-f613-46c1-b8e6-9caf192aa3cb req-c386a8c5-16ce-488b-bf5c-3967fa069712 service nova] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] No waiting events found dispatching network-vif-plugged-55278f04-c477-4777-b662-6afbaf96a16e {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 946.847221] env[68617]: 
WARNING nova.compute.manager [req-49fde391-f613-46c1-b8e6-9caf192aa3cb req-c386a8c5-16ce-488b-bf5c-3967fa069712 service nova] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Received unexpected event network-vif-plugged-55278f04-c477-4777-b662-6afbaf96a16e for instance with vm_state building and task_state deleting. [ 946.939769] env[68617]: DEBUG nova.network.neutron [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Updating instance_info_cache with network_info: [{"id": "55278f04-c477-4777-b662-6afbaf96a16e", "address": "fa:16:3e:78:8d:6e", "network": {"id": "ad29e76d-388b-42ca-9526-7b6c236321e3", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1855301645-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4de7b27e9cf04c16b8dee80e756404fd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8e272539-d425-489f-9a63-aba692e88933", "external-id": "nsx-vlan-transportzone-869", "segmentation_id": 869, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap55278f04-c4", "ovs_interfaceid": "55278f04-c477-4777-b662-6afbaf96a16e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 946.952591] env[68617]: DEBUG oslo_concurrency.lockutils [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Releasing lock "refresh_cache-995585f5-57a4-4ba6-9e28-18a086af264c" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 946.952842] env[68617]: DEBUG nova.compute.manager [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Instance network_info: |[{"id": "55278f04-c477-4777-b662-6afbaf96a16e", "address": "fa:16:3e:78:8d:6e", "network": {"id": "ad29e76d-388b-42ca-9526-7b6c236321e3", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1855301645-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4de7b27e9cf04c16b8dee80e756404fd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8e272539-d425-489f-9a63-aba692e88933", "external-id": "nsx-vlan-transportzone-869", "segmentation_id": 869, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap55278f04-c4", "ovs_interfaceid": 
"55278f04-c477-4777-b662-6afbaf96a16e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 946.953287] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:78:8d:6e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8e272539-d425-489f-9a63-aba692e88933', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '55278f04-c477-4777-b662-6afbaf96a16e', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 946.963501] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Creating folder: Project (4de7b27e9cf04c16b8dee80e756404fd). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 946.964644] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 050e2b27-1311-4a9a-b5cf-6bc2f7128eba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 946.964772] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f13242a0-7e65-4d68-a317-16fb8c4b8f8a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 946.965934] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 3b95678b-dfc5-4610-a51e-2ae12fbe274b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 946.965934] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6300077d-5aa7-4794-8ba2-1ec30151c15c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 946.965934] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 4ea5887f-84bd-4629-b568-e73c78af0ad4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 946.965934] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6eef6e24-cf49-458b-ae37-8da4e02045f8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 946.966133] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 71b1ebba-2019-4378-9bd2-98a7559c22e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 946.966133] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e6b6cbdd-11d6-44a6-8da7-98e0f52cef67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 946.966133] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b27ace75-e2fa-4acc-96cb-88dd49b89de5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 946.966133] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 995585f5-57a4-4ba6-9e28-18a086af264c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 946.967425] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0bbd0dd7-3b29-450e-8e8a-5ca807903ad9 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 946.978585] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 53abf4e7-35ea-415b-8a90-a89442c475a1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 946.981528] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Created folder: Project (4de7b27e9cf04c16b8dee80e756404fd) in parent group-v693691. [ 946.981612] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Creating folder: Instances. Parent ref: group-v693734. 
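
Annotation: before the VM is created, vm_util builds a two-level folder hierarchy in vCenter: a per-project folder named after the tenant under the OpenStack root (group-v693691 here), then an "Instances" child (group-v693734). A sketch of the call, assuming an established oslo_vmware.api.VMwareAPISession named `session` and a parent managed-object reference `openstack_root_ref` (both placeholders, not values from this log):

    def create_folder(session, parent_ref, name):
        # Folder.CreateFolder via the session; Nova's vm_util tolerates a
        # DuplicateName fault here and reuses the existing folder.
        return session.invoke_api(session.vim, 'CreateFolder',
                                  parent_ref, name=name)

    project_folder = create_folder(
        session, openstack_root_ref,
        'Project (4de7b27e9cf04c16b8dee80e756404fd)')
    instances_folder = create_folder(session, project_folder, 'Instances')
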
{{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 946.982312] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0feb30cc-59c9-4a6b-904c-881971590b0c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 946.996897] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Created folder: Instances in parent group-v693734. [ 946.997149] env[68617]: DEBUG oslo.service.loopingcall [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 946.997852] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 946.998470] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 8d0a643a-96c5-4d47-aa2a-9f777e80c259 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 946.999544] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1022cc31-bc2c-414f-a2e2-53908787438e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.017635] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6341f8b6-9f42-4ab2-806f-dbad62de5376 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 947.025194] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 947.025194] env[68617]: value = "task-3470750" [ 947.025194] env[68617]: _type = "Task" [ 947.025194] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 947.029634] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance cc32959e-71ea-44cb-aebe-bf6a893ebb18 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 947.035449] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470750, 'name': CreateVM_Task} progress is 0%. 
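
Annotation: CreateVM_Task is asynchronous. wait_for_task submits the call and _poll_task re-reads the task state on a fixed interval until it reaches a terminal state, which is what the recurring "progress is 0%" lines are. A simplified stand-in for that loop (not oslo.vmware's actual implementation; `get_task_info` is a placeholder callable returning an object with .state, .progress, .result and .error, mimicking vim.TaskInfo):

    import time

    def wait_for_task(get_task_info, interval=0.5):
        while True:
            info = get_task_info()
            if info.state == 'success':
                return info.result           # "completed successfully"
            if info.state == 'error':
                raise RuntimeError(info.error)
            time.sleep(interval)             # poll again; progress is logged
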
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 947.043260] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 704e4d83-19ef-493a-a374-dce0de95e975 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 947.055893] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance df9a6dc4-abb5-4855-ac4f-5479dd0b6498 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 947.068341] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f6f64438-8279-4ff4-ab80-efd1e17d7e04 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 947.079055] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f2bef6cc-f5e2-41a8-b377-31f016746257 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 947.090762] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance fe74e8d8-e439-4834-9721-08d9e64c7740 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 947.108780] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 82864ac3-a199-478c-8c57-97ea0a256201 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 947.122773] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 947.136146] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 39f1e776-4df9-4b24-a51b-c1a15a943a76 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 947.153969] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 79c92a1b-20ef-4360-93b4-913cbfcf92fe has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 947.170239] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 40de8cd1-1c46-4ffb-866b-255386fe44b6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 947.184988] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 1cc76382-5452-4ed4-bb99-c6800c70d42a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 947.203638] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 3258d1a5-7142-4e06-814d-e68fd90262ae has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 947.214111] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e3a2fb7d-b092-485f-b64a-486c458ba845 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 947.230415] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance eaeae56d-8e71-43bc-8441-49a29c161763 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 947.246786] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 79d8a532-b071-4c79-8c5d-f08438928201 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 947.259195] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance c9e6a9e1-6479-47ba-ae12-0441d2761bb6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 947.271304] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 96bc8135-1233-4569-99ce-c7a529b96d11 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 947.281680] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f560b4df-fb57-4f7b-8a8b-53325970e06e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
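
Annotation: two branches of the allocation audit appear above: resource_tracker.py:1707 for instances this host actively manages (allocations are kept) and resource_tracker.py:1764 for instances placement has an allocation for but which have not started here yet (heal is skipped). A schematic reduction of that decision; Nova's real logic also weighs vm_state and migration records:

    def classify_allocation(uuid, managed_uuids, scheduled_uuids):
        if uuid in managed_uuids:
            return 'actively managed: keep allocation'       # line 1707
        if uuid in scheduled_uuids:
            return 'scheduled, not yet started: skip heal'   # line 1764
        return 'unknown consumer: candidate for cleanup'
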
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 947.281939] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 947.282100] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1856MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 947.303056] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Refreshing inventories for resource provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 947.320356] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Updating ProviderTree inventory for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 947.320723] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Updating inventory in ProviderTree for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 947.335080] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Refreshing aggregate associations for resource provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f, aggregates: None {{(pid=68617) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 947.360358] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Refreshing trait associations for resource provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_VMDK {{(pid=68617) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 947.535916] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470750, 'name': CreateVM_Task, 'duration_secs': 0.40076} completed successfully. 
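
Annotation: the totals in the "Final resource view" line reconcile exactly with the per-instance allocations enumerated above once the 512 MB reserved in the provider inventory is included: ten actively managed instances, nine at 128 MB and one (b27ace75-e2fa-4acc-96cb-88dd49b89de5) at 192 MB, each with 1 VCPU and a 1 GB root disk:

    instances_mb = [128] * 9 + [192]      # the ten actively managed instances
    used_ram = 512 + sum(instances_mb)    # 512 reserved + 1344 = 1856 MB
    used_vcpus = len(instances_mb)        # 1 VCPU each  -> 10
    used_disk = len(instances_mb)         # 1 GB root each -> 10 GB
    assert (used_ram, used_vcpus, used_disk) == (1856, 10, 10)
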
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 947.536197] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 947.537103] env[68617]: DEBUG oslo_concurrency.lockutils [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 947.537103] env[68617]: DEBUG oslo_concurrency.lockutils [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 947.537308] env[68617]: DEBUG oslo_concurrency.lockutils [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 947.537542] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-983a070e-176e-4c43-bb67-0f8e526dd000 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.545348] env[68617]: DEBUG oslo_vmware.api [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Waiting for the task: (returnval){ [ 947.545348] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52498c7f-e4fe-a71c-aceb-fe13ee79c27e" [ 947.545348] env[68617]: _type = "Task" [ 947.545348] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 947.553595] env[68617]: DEBUG oslo_vmware.api [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52498c7f-e4fe-a71c-aceb-fe13ee79c27e, 'name': SearchDatastore_Task} progress is 0%. 
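
Annotation: the "[datastore2] devstack-image-cache_base/..." lock plus the SearchDatastore_Task probe is the front half of _fetch_image_if_missing: only one build at a time may inspect, and on a miss populate, the cached VMDK for a given image. In outline, with hypothetical helpers standing in for the probe and the download:

    from oslo_concurrency import lockutils

    def fetch_image_if_missing(image_id, datastore, cache_has, download):
        cache_path = ('[%s] devstack-image-cache_base/%s'
                      % (datastore, image_id))
        with lockutils.lock(cache_path):
            if not cache_has(cache_path):      # SearchDatastore_Task probe
                download(image_id, cache_path) # HTTP import on a cache miss
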
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 947.785625] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75b7ed29-659b-421d-8c1f-2424fd6c5e74 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.793578] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6dc1a949-e90b-4d4b-805e-3c6de93e0e82 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.830444] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56722b62-6641-470b-8541-107349e60eaf {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.842208] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fb78c3b-6198-48e2-81eb-24d5e57e00d8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.856444] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 947.869275] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 947.893556] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 947.893841] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.104s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 948.055212] env[68617]: DEBUG oslo_concurrency.lockutils [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 948.055537] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 948.055707] env[68617]: DEBUG oslo_concurrency.lockutils [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 948.894769] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 949.133753] env[68617]: DEBUG nova.compute.manager [req-dfdd82ba-aa03-4197-8b84-8d8373c7edbf req-6d7bcb96-ba4b-4dd1-abac-360bd51fb0fa service nova] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Received event network-changed-55278f04-c477-4777-b662-6afbaf96a16e {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 949.133990] env[68617]: DEBUG nova.compute.manager [req-dfdd82ba-aa03-4197-8b84-8d8373c7edbf req-6d7bcb96-ba4b-4dd1-abac-360bd51fb0fa service nova] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Refreshing instance network info cache due to event network-changed-55278f04-c477-4777-b662-6afbaf96a16e. {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 949.134158] env[68617]: DEBUG oslo_concurrency.lockutils [req-dfdd82ba-aa03-4197-8b84-8d8373c7edbf req-6d7bcb96-ba4b-4dd1-abac-360bd51fb0fa service nova] Acquiring lock "refresh_cache-995585f5-57a4-4ba6-9e28-18a086af264c" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 949.136769] env[68617]: DEBUG oslo_concurrency.lockutils [req-dfdd82ba-aa03-4197-8b84-8d8373c7edbf req-6d7bcb96-ba4b-4dd1-abac-360bd51fb0fa service nova] Acquired lock "refresh_cache-995585f5-57a4-4ba6-9e28-18a086af264c" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 949.136769] env[68617]: DEBUG nova.network.neutron [req-dfdd82ba-aa03-4197-8b84-8d8373c7edbf req-6d7bcb96-ba4b-4dd1-abac-360bd51fb0fa service nova] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Refreshing network info cache for port 55278f04-c477-4777-b662-6afbaf96a16e {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 950.033667] env[68617]: DEBUG nova.network.neutron [req-dfdd82ba-aa03-4197-8b84-8d8373c7edbf req-6d7bcb96-ba4b-4dd1-abac-360bd51fb0fa service nova] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Updated VIF entry in instance network info cache for port 55278f04-c477-4777-b662-6afbaf96a16e. 
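
Annotation: the network-changed-55278f04... external event closes the loop from the Neutron side: the compute manager re-queries Neutron and rewrites the instance's cached network_info under a per-instance "refresh_cache-<uuid>" lock, so concurrent refreshes for the same instance serialize. Condensed, with `get_nw_info` and `cache` as placeholders:

    from oslo_concurrency import lockutils

    def handle_network_changed(instance_uuid, get_nw_info, cache):
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            # Re-query Neutron and replace the cached entry, as in
            # "Updating instance_info_cache with network_info: [...]".
            cache[instance_uuid] = get_nw_info(instance_uuid)
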
{{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 950.035424] env[68617]: DEBUG nova.network.neutron [req-dfdd82ba-aa03-4197-8b84-8d8373c7edbf req-6d7bcb96-ba4b-4dd1-abac-360bd51fb0fa service nova] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Updating instance_info_cache with network_info: [{"id": "55278f04-c477-4777-b662-6afbaf96a16e", "address": "fa:16:3e:78:8d:6e", "network": {"id": "ad29e76d-388b-42ca-9526-7b6c236321e3", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1855301645-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4de7b27e9cf04c16b8dee80e756404fd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8e272539-d425-489f-9a63-aba692e88933", "external-id": "nsx-vlan-transportzone-869", "segmentation_id": 869, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap55278f04-c4", "ovs_interfaceid": "55278f04-c477-4777-b662-6afbaf96a16e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 950.045433] env[68617]: DEBUG oslo_concurrency.lockutils [req-dfdd82ba-aa03-4197-8b84-8d8373c7edbf req-6d7bcb96-ba4b-4dd1-abac-360bd51fb0fa service nova] Releasing lock "refresh_cache-995585f5-57a4-4ba6-9e28-18a086af264c" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 954.842713] env[68617]: DEBUG oslo_concurrency.lockutils [None req-75f49065-4ad5-4a89-990d-f4fdd6a5a0a5 tempest-AttachInterfacesTestJSON-753337404 tempest-AttachInterfacesTestJSON-753337404-project-member] Acquiring lock "e4ac9902-3e8b-4790-a00b-2fd45f16ff63" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 954.843318] env[68617]: DEBUG oslo_concurrency.lockutils [None req-75f49065-4ad5-4a89-990d-f4fdd6a5a0a5 tempest-AttachInterfacesTestJSON-753337404 tempest-AttachInterfacesTestJSON-753337404-project-member] Lock "e4ac9902-3e8b-4790-a00b-2fd45f16ff63" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 956.664649] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e0dddcde-5965-4448-b33a-0c88fdb64fe2 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquiring lock "85bfa486-9f65-40d6-a392-54fdf87da1a1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 956.664649] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e0dddcde-5965-4448-b33a-0c88fdb64fe2 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Lock
"85bfa486-9f65-40d6-a392-54fdf87da1a1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 958.325728] env[68617]: DEBUG oslo_concurrency.lockutils [None req-31397e9d-74f9-4277-b762-6589b4c28702 tempest-ImagesTestJSON-918330909 tempest-ImagesTestJSON-918330909-project-member] Acquiring lock "f63c673e-40dc-49d3-b356-85629ada1101" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 958.326033] env[68617]: DEBUG oslo_concurrency.lockutils [None req-31397e9d-74f9-4277-b762-6589b4c28702 tempest-ImagesTestJSON-918330909 tempest-ImagesTestJSON-918330909-project-member] Lock "f63c673e-40dc-49d3-b356-85629ada1101" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 959.069751] env[68617]: DEBUG oslo_concurrency.lockutils [None req-399a0f64-0267-46ff-b331-fe461e69dc70 tempest-ServerActionsTestJSON-789019370 tempest-ServerActionsTestJSON-789019370-project-member] Acquiring lock "d3b6336e-4baa-426e-a31d-9788cd2131a0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 959.069965] env[68617]: DEBUG oslo_concurrency.lockutils [None req-399a0f64-0267-46ff-b331-fe461e69dc70 tempest-ServerActionsTestJSON-789019370 tempest-ServerActionsTestJSON-789019370-project-member] Lock "d3b6336e-4baa-426e-a31d-9788cd2131a0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 960.606934] env[68617]: DEBUG oslo_concurrency.lockutils [None req-12c64557-bf69-4e0e-af49-57136b751ce7 tempest-ServersNegativeTestJSON-272895408 tempest-ServersNegativeTestJSON-272895408-project-member] Acquiring lock "21c0de14-cb70-4a41-954f-aaa904d1514a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 960.608236] env[68617]: DEBUG oslo_concurrency.lockutils [None req-12c64557-bf69-4e0e-af49-57136b751ce7 tempest-ServersNegativeTestJSON-272895408 tempest-ServersNegativeTestJSON-272895408-project-member] Lock "21c0de14-cb70-4a41-954f-aaa904d1514a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 963.990192] env[68617]: DEBUG oslo_concurrency.lockutils [None req-46c98673-4183-49ef-95be-7e9c465a3475 tempest-InstanceActionsV221TestJSON-2063899890 tempest-InstanceActionsV221TestJSON-2063899890-project-member] Acquiring lock "b5a088c8-429a-49b3-b330-315d15ace97f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 963.990549]
env[68617]: DEBUG oslo_concurrency.lockutils [None req-46c98673-4183-49ef-95be-7e9c465a3475 tempest-InstanceActionsV221TestJSON-2063899890 tempest-InstanceActionsV221TestJSON-2063899890-project-member] Lock "b5a088c8-429a-49b3-b330-315d15ace97f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 987.693181] env[68617]: WARNING oslo_vmware.rw_handles [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 987.693181] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 987.693181] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 987.693181] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 987.693181] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 987.693181] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 987.693181] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 987.693181] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 987.693181] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 987.693181] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 987.693181] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 987.693181] env[68617]: ERROR oslo_vmware.rw_handles [ 987.693862] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/2ef72850-9e1a-4f26-954d-f297e01eeb9f/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 987.695546] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 987.695836] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Copying Virtual Disk [datastore2] vmware_temp/2ef72850-9e1a-4f26-954d-f297e01eeb9f/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/2ef72850-9e1a-4f26-954d-f297e01eeb9f/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 987.696183] env[68617]: DEBUG oslo_vmware.service [-] Invoking
VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b832e89e-e2bd-4d69-ad4e-842326342704 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 987.704165] env[68617]: DEBUG oslo_vmware.api [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Waiting for the task: (returnval){ [ 987.704165] env[68617]: value = "task-3470751" [ 987.704165] env[68617]: _type = "Task" [ 987.704165] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 987.712262] env[68617]: DEBUG oslo_vmware.api [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Task: {'id': task-3470751, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 988.214762] env[68617]: DEBUG oslo_vmware.exceptions [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Fault InvalidArgument not matched. {{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 988.215120] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 988.215669] env[68617]: ERROR nova.compute.manager [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 988.215669] env[68617]: Faults: ['InvalidArgument'] [ 988.215669] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Traceback (most recent call last): [ 988.215669] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 988.215669] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] yield resources [ 988.215669] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 988.215669] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] self.driver.spawn(context, instance, image_meta, [ 988.215669] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 988.215669] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 988.215669] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 
786, in spawn [ 988.215669] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] self._fetch_image_if_missing(context, vi) [ 988.215669] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 988.217313] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] image_cache(vi, tmp_image_ds_loc) [ 988.217313] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 988.217313] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] vm_util.copy_virtual_disk( [ 988.217313] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 988.217313] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] session._wait_for_task(vmdk_copy_task) [ 988.217313] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 988.217313] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] return self.wait_for_task(task_ref) [ 988.217313] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 988.217313] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] return evt.wait() [ 988.217313] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 988.217313] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] result = hub.switch() [ 988.217313] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 988.217313] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] return self.greenlet.switch() [ 988.217717] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 988.217717] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] self.f(*self.args, **self.kw) [ 988.217717] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 988.217717] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] raise exceptions.translate_fault(task_info.error) [ 988.217717] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 988.217717] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Faults: ['InvalidArgument'] [ 988.217717] env[68617]: ERROR nova.compute.manager [instance: 
3b95678b-dfc5-4610-a51e-2ae12fbe274b] [ 988.217717] env[68617]: INFO nova.compute.manager [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Terminating instance [ 988.217717] env[68617]: DEBUG oslo_concurrency.lockutils [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 988.217979] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 988.218087] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1857a938-c96a-4cf5-ad26-e660fa27b4c6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 988.220163] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Acquiring lock "refresh_cache-3b95678b-dfc5-4610-a51e-2ae12fbe274b" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 988.220318] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Acquired lock "refresh_cache-3b95678b-dfc5-4610-a51e-2ae12fbe274b" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 988.220498] env[68617]: DEBUG nova.network.neutron [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 988.227364] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 988.227566] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 988.228749] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a95ebea0-4b01-466d-9c6c-55b9e7af7b46 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 988.237037] env[68617]: DEBUG oslo_vmware.api [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Waiting for the task: (returnval){ [ 988.237037] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]527435ae-ee5a-670a-02db-471d03cb03cf" [ 988.237037] env[68617]: _type = "Task" [ 988.237037] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 988.244015] env[68617]: DEBUG oslo_vmware.api [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]527435ae-ee5a-670a-02db-471d03cb03cf, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 988.250237] env[68617]: DEBUG nova.network.neutron [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 988.314619] env[68617]: DEBUG nova.network.neutron [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 988.324409] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Releasing lock "refresh_cache-3b95678b-dfc5-4610-a51e-2ae12fbe274b" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 988.324848] env[68617]: DEBUG nova.compute.manager [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Start destroying the instance on the hypervisor. 
{{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 988.325074] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 988.326180] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6519b31-c138-4148-8159-6c078d7db036 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 988.334150] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 988.334363] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c883349a-ad79-4834-b1d4-8cf3ea00a5be {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 988.363493] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 988.363880] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 988.364087] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Deleting the datastore file [datastore2] 3b95678b-dfc5-4610-a51e-2ae12fbe274b {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 988.364441] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-696c426c-dcab-45d7-85ca-fb790f56ba0d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 988.370314] env[68617]: DEBUG oslo_vmware.api [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Waiting for the task: (returnval){ [ 988.370314] env[68617]: value = "task-3470753" [ 988.370314] env[68617]: _type = "Task" [ 988.370314] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 988.377837] env[68617]: DEBUG oslo_vmware.api [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Task: {'id': task-3470753, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 988.746312] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 988.746749] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Creating directory with path [datastore2] vmware_temp/14280aec-a9c8-4af7-99a2-0bce35755808/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 988.746846] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-776c3b30-ab21-47b2-8e8e-be430436e64d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 988.757521] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Created directory with path [datastore2] vmware_temp/14280aec-a9c8-4af7-99a2-0bce35755808/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 988.757704] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Fetch image to [datastore2] vmware_temp/14280aec-a9c8-4af7-99a2-0bce35755808/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 988.757866] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/14280aec-a9c8-4af7-99a2-0bce35755808/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 988.758550] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a276842-ece5-4621-b79d-543cce639941 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 988.764880] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35ebe64f-37d2-4761-a355-1880cdcb09fb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 988.773695] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f0c6f88-8bbc-4491-9b2b-e9922c183529 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 988.805736] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-ddf83d79-c21a-472a-b193-85ed31c1946b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 988.811070] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ff83ac32-1f24-4e1d-9279-4864da385937 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 988.838864] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 988.880170] env[68617]: DEBUG oslo_vmware.api [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Task: {'id': task-3470753, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.042675} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 988.880439] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 988.880618] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 988.880793] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 988.880960] env[68617]: INFO nova.compute.manager [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Took 0.56 seconds to destroy the instance on the hypervisor. [ 988.881214] env[68617]: DEBUG oslo.service.loopingcall [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 988.881414] env[68617]: DEBUG nova.compute.manager [-] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Skipping network deallocation for instance since networking was not requested.
{{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 988.883528] env[68617]: DEBUG nova.compute.claims [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 988.883717] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 988.883929] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 988.940401] env[68617]: DEBUG oslo_vmware.rw_handles [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/14280aec-a9c8-4af7-99a2-0bce35755808/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 988.999881] env[68617]: DEBUG oslo_vmware.rw_handles [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 989.000079] env[68617]: DEBUG oslo_vmware.rw_handles [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/14280aec-a9c8-4af7-99a2-0bce35755808/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 989.303403] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1f6de8f-bd96-4e39-aee7-83c25911c3d4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 989.311429] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23a20870-5884-4fa2-9e5d-dd3ebca2f803 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 989.342665] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c7cf07e-8531-4261-a1d8-ed2844aaf339 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 989.349906] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f942e265-2fd7-4482-8bf1-db36b21e6504 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 989.363313] env[68617]: DEBUG nova.compute.provider_tree [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 989.373022] env[68617]: DEBUG nova.scheduler.client.report [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 989.388857] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.504s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 989.388919] env[68617]: ERROR nova.compute.manager [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 989.388919] env[68617]: Faults: ['InvalidArgument'] [ 989.388919] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Traceback (most recent call last): [ 989.388919] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 989.388919] env[68617]: 
ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] self.driver.spawn(context, instance, image_meta, [ 989.388919] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 989.388919] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 989.388919] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 989.388919] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] self._fetch_image_if_missing(context, vi) [ 989.388919] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 989.388919] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] image_cache(vi, tmp_image_ds_loc) [ 989.388919] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 989.389310] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] vm_util.copy_virtual_disk( [ 989.389310] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 989.389310] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] session._wait_for_task(vmdk_copy_task) [ 989.389310] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 989.389310] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] return self.wait_for_task(task_ref) [ 989.389310] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 989.389310] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] return evt.wait() [ 989.389310] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 989.389310] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] result = hub.switch() [ 989.389310] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 989.389310] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] return self.greenlet.switch() [ 989.389310] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 989.389310] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] self.f(*self.args, **self.kw) [ 989.389692] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 989.389692] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] raise exceptions.translate_fault(task_info.error) [ 989.389692] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 989.389692] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Faults: ['InvalidArgument'] [ 989.389692] env[68617]: ERROR nova.compute.manager [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] [ 989.389692] env[68617]: DEBUG nova.compute.utils [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 989.391078] env[68617]: DEBUG nova.compute.manager [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Build of instance 3b95678b-dfc5-4610-a51e-2ae12fbe274b was re-scheduled: A specified parameter was not correct: fileType [ 989.391078] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 989.391441] env[68617]: DEBUG nova.compute.manager [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 989.391656] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Acquiring lock "refresh_cache-3b95678b-dfc5-4610-a51e-2ae12fbe274b" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 989.391801] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Acquired lock "refresh_cache-3b95678b-dfc5-4610-a51e-2ae12fbe274b" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 989.391957] env[68617]: DEBUG nova.network.neutron [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 989.417062] env[68617]: DEBUG nova.network.neutron [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Instance cache missing network info. 
{{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 989.484090] env[68617]: DEBUG nova.network.neutron [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 989.493541] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Releasing lock "refresh_cache-3b95678b-dfc5-4610-a51e-2ae12fbe274b" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 989.493644] env[68617]: DEBUG nova.compute.manager [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 989.493796] env[68617]: DEBUG nova.compute.manager [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Skipping network deallocation for instance since networking was not requested. {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 989.585229] env[68617]: INFO nova.scheduler.client.report [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Deleted allocations for instance 3b95678b-dfc5-4610-a51e-2ae12fbe274b [ 989.606381] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0cb5a4d9-3870-4f5c-b8da-eb7b2a1c857e tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Lock "3b95678b-dfc5-4610-a51e-2ae12fbe274b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 287.226s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 989.607491] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3688968f-917b-4fd9-807f-5df2c4a88464 tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Lock "3b95678b-dfc5-4610-a51e-2ae12fbe274b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 86.603s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 989.607726] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3688968f-917b-4fd9-807f-5df2c4a88464 tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Acquiring lock "3b95678b-dfc5-4610-a51e-2ae12fbe274b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 989.607929] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3688968f-917b-4fd9-807f-5df2c4a88464 tempest-ServerDiagnosticsV248Test-305417211
tempest-ServerDiagnosticsV248Test-305417211-project-member] Lock "3b95678b-dfc5-4610-a51e-2ae12fbe274b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 989.608106] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3688968f-917b-4fd9-807f-5df2c4a88464 tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Lock "3b95678b-dfc5-4610-a51e-2ae12fbe274b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 989.610261] env[68617]: INFO nova.compute.manager [None req-3688968f-917b-4fd9-807f-5df2c4a88464 tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Terminating instance [ 989.611884] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3688968f-917b-4fd9-807f-5df2c4a88464 tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Acquiring lock "refresh_cache-3b95678b-dfc5-4610-a51e-2ae12fbe274b" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 989.612055] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3688968f-917b-4fd9-807f-5df2c4a88464 tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Acquired lock "refresh_cache-3b95678b-dfc5-4610-a51e-2ae12fbe274b" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 989.612232] env[68617]: DEBUG nova.network.neutron [None req-3688968f-917b-4fd9-807f-5df2c4a88464 tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 989.632893] env[68617]: DEBUG nova.compute.manager [None req-f3064b87-2ca0-4b92-b4a8-1120c3d2f60d tempest-AttachInterfacesTestJSON-753337404 tempest-AttachInterfacesTestJSON-753337404-project-member] [instance: 53abf4e7-35ea-415b-8a90-a89442c475a1] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 989.638702] env[68617]: DEBUG nova.network.neutron [None req-3688968f-917b-4fd9-807f-5df2c4a88464 tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 989.656912] env[68617]: DEBUG nova.compute.manager [None req-f3064b87-2ca0-4b92-b4a8-1120c3d2f60d tempest-AttachInterfacesTestJSON-753337404 tempest-AttachInterfacesTestJSON-753337404-project-member] [instance: 53abf4e7-35ea-415b-8a90-a89442c475a1] Instance disappeared before build.
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 989.678865] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f3064b87-2ca0-4b92-b4a8-1120c3d2f60d tempest-AttachInterfacesTestJSON-753337404 tempest-AttachInterfacesTestJSON-753337404-project-member] Lock "53abf4e7-35ea-415b-8a90-a89442c475a1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 240.767s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 989.689865] env[68617]: DEBUG nova.compute.manager [None req-86503bab-b95f-46f2-ae42-9dadcd71b7ba tempest-ServerAddressesTestJSON-1323985713 tempest-ServerAddressesTestJSON-1323985713-project-member] [instance: 8d0a643a-96c5-4d47-aa2a-9f777e80c259] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 989.700359] env[68617]: DEBUG nova.network.neutron [None req-3688968f-917b-4fd9-807f-5df2c4a88464 tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 989.709030] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3688968f-917b-4fd9-807f-5df2c4a88464 tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Releasing lock "refresh_cache-3b95678b-dfc5-4610-a51e-2ae12fbe274b" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 989.709343] env[68617]: DEBUG nova.compute.manager [None req-3688968f-917b-4fd9-807f-5df2c4a88464 tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 989.709536] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-3688968f-917b-4fd9-807f-5df2c4a88464 tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 989.710029] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-677ea9b5-0ece-4882-8a51-b78eec632076 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 989.714034] env[68617]: DEBUG nova.compute.manager [None req-86503bab-b95f-46f2-ae42-9dadcd71b7ba tempest-ServerAddressesTestJSON-1323985713 tempest-ServerAddressesTestJSON-1323985713-project-member] [instance: 8d0a643a-96c5-4d47-aa2a-9f777e80c259] Instance disappeared before build.
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 989.720198] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f018d5f-de77-4ac2-bd77-21f8ea8ea50b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 989.740936] env[68617]: DEBUG oslo_concurrency.lockutils [None req-86503bab-b95f-46f2-ae42-9dadcd71b7ba tempest-ServerAddressesTestJSON-1323985713 tempest-ServerAddressesTestJSON-1323985713-project-member] Lock "8d0a643a-96c5-4d47-aa2a-9f777e80c259" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 239.330s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 989.752385] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-3688968f-917b-4fd9-807f-5df2c4a88464 tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3b95678b-dfc5-4610-a51e-2ae12fbe274b could not be found. [ 989.752668] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-3688968f-917b-4fd9-807f-5df2c4a88464 tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 989.752762] env[68617]: INFO nova.compute.manager [None req-3688968f-917b-4fd9-807f-5df2c4a88464 tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Took 0.04 seconds to destroy the instance on the hypervisor. [ 989.752979] env[68617]: DEBUG oslo.service.loopingcall [None req-3688968f-917b-4fd9-807f-5df2c4a88464 tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 989.753455] env[68617]: DEBUG nova.compute.manager [None req-c22f9f6b-d3b5-4e1b-a997-e31bb568d743 tempest-ServersTestJSON-1804119009 tempest-ServersTestJSON-1804119009-project-member] [instance: 6341f8b6-9f42-4ab2-806f-dbad62de5376] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 989.755837] env[68617]: DEBUG nova.compute.manager [-] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 989.755936] env[68617]: DEBUG nova.network.neutron [-] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 989.773177] env[68617]: DEBUG nova.network.neutron [-] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Instance cache missing network info.
{{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 989.779573] env[68617]: DEBUG nova.compute.manager [None req-c22f9f6b-d3b5-4e1b-a997-e31bb568d743 tempest-ServersTestJSON-1804119009 tempest-ServersTestJSON-1804119009-project-member] [instance: 6341f8b6-9f42-4ab2-806f-dbad62de5376] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 989.783911] env[68617]: DEBUG nova.network.neutron [-] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 989.791979] env[68617]: INFO nova.compute.manager [-] [instance: 3b95678b-dfc5-4610-a51e-2ae12fbe274b] Took 0.04 seconds to deallocate network for instance. [ 989.801073] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c22f9f6b-d3b5-4e1b-a997-e31bb568d743 tempest-ServersTestJSON-1804119009 tempest-ServersTestJSON-1804119009-project-member] Lock "6341f8b6-9f42-4ab2-806f-dbad62de5376" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 238.587s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 989.810410] env[68617]: DEBUG nova.compute.manager [None req-be884208-027a-4bdd-8f4b-0274fa0ae816 tempest-TenantUsagesTestJSON-2135587497 tempest-TenantUsagesTestJSON-2135587497-project-member] [instance: cc32959e-71ea-44cb-aebe-bf6a893ebb18] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 989.842140] env[68617]: DEBUG nova.compute.manager [None req-be884208-027a-4bdd-8f4b-0274fa0ae816 tempest-TenantUsagesTestJSON-2135587497 tempest-TenantUsagesTestJSON-2135587497-project-member] [instance: cc32959e-71ea-44cb-aebe-bf6a893ebb18] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 989.867870] env[68617]: DEBUG oslo_concurrency.lockutils [None req-be884208-027a-4bdd-8f4b-0274fa0ae816 tempest-TenantUsagesTestJSON-2135587497 tempest-TenantUsagesTestJSON-2135587497-project-member] Lock "cc32959e-71ea-44cb-aebe-bf6a893ebb18" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 238.441s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 989.878321] env[68617]: DEBUG nova.compute.manager [None req-df3c1748-33bf-4848-8e70-a516d8f991e9 tempest-ImagesOneServerNegativeTestJSON-249197895 tempest-ImagesOneServerNegativeTestJSON-249197895-project-member] [instance: 704e4d83-19ef-493a-a374-dce0de95e975] Starting instance...
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 989.904277] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3688968f-917b-4fd9-807f-5df2c4a88464 tempest-ServerDiagnosticsV248Test-305417211 tempest-ServerDiagnosticsV248Test-305417211-project-member] Lock "3b95678b-dfc5-4610-a51e-2ae12fbe274b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.297s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 989.906325] env[68617]: DEBUG nova.compute.manager [None req-df3c1748-33bf-4848-8e70-a516d8f991e9 tempest-ImagesOneServerNegativeTestJSON-249197895 tempest-ImagesOneServerNegativeTestJSON-249197895-project-member] [instance: 704e4d83-19ef-493a-a374-dce0de95e975] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 989.929874] env[68617]: DEBUG oslo_concurrency.lockutils [None req-df3c1748-33bf-4848-8e70-a516d8f991e9 tempest-ImagesOneServerNegativeTestJSON-249197895 tempest-ImagesOneServerNegativeTestJSON-249197895-project-member] Lock "704e4d83-19ef-493a-a374-dce0de95e975" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 236.112s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 989.938461] env[68617]: DEBUG nova.compute.manager [None req-9e1cc523-2db5-40d8-9590-d32bffc942bf tempest-ImagesTestJSON-918330909 tempest-ImagesTestJSON-918330909-project-member] [instance: df9a6dc4-abb5-4855-ac4f-5479dd0b6498] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 989.962649] env[68617]: DEBUG nova.compute.manager [None req-9e1cc523-2db5-40d8-9590-d32bffc942bf tempest-ImagesTestJSON-918330909 tempest-ImagesTestJSON-918330909-project-member] [instance: df9a6dc4-abb5-4855-ac4f-5479dd0b6498] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 989.983361] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9e1cc523-2db5-40d8-9590-d32bffc942bf tempest-ImagesTestJSON-918330909 tempest-ImagesTestJSON-918330909-project-member] Lock "df9a6dc4-abb5-4855-ac4f-5479dd0b6498" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 230.696s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 989.992297] env[68617]: DEBUG nova.compute.manager [None req-de31d373-1ac6-424b-8651-58ff44a68371 tempest-ServersAaction247Test-609120817 tempest-ServersAaction247Test-609120817-project-member] [instance: f6f64438-8279-4ff4-ab80-efd1e17d7e04] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 990.015596] env[68617]: DEBUG nova.compute.manager [None req-de31d373-1ac6-424b-8651-58ff44a68371 tempest-ServersAaction247Test-609120817 tempest-ServersAaction247Test-609120817-project-member] [instance: f6f64438-8279-4ff4-ab80-efd1e17d7e04] Instance disappeared before build.
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 990.037684] env[68617]: DEBUG oslo_concurrency.lockutils [None req-de31d373-1ac6-424b-8651-58ff44a68371 tempest-ServersAaction247Test-609120817 tempest-ServersAaction247Test-609120817-project-member] Lock "f6f64438-8279-4ff4-ab80-efd1e17d7e04" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 230.069s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 990.046917] env[68617]: DEBUG nova.compute.manager [None req-27c732df-2043-469d-9521-c021487abf4b tempest-InstanceActionsTestJSON-641152923 tempest-InstanceActionsTestJSON-641152923-project-member] [instance: f2bef6cc-f5e2-41a8-b377-31f016746257] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 990.070104] env[68617]: DEBUG nova.compute.manager [None req-27c732df-2043-469d-9521-c021487abf4b tempest-InstanceActionsTestJSON-641152923 tempest-InstanceActionsTestJSON-641152923-project-member] [instance: f2bef6cc-f5e2-41a8-b377-31f016746257] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 990.095021] env[68617]: DEBUG oslo_concurrency.lockutils [None req-27c732df-2043-469d-9521-c021487abf4b tempest-InstanceActionsTestJSON-641152923 tempest-InstanceActionsTestJSON-641152923-project-member] Lock "f2bef6cc-f5e2-41a8-b377-31f016746257" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 219.212s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 990.104200] env[68617]: DEBUG nova.compute.manager [None req-645ccf85-ce17-473f-8068-5b0c95db8022 tempest-ServersTestBootFromVolume-1306830610 tempest-ServersTestBootFromVolume-1306830610-project-member] [instance: fe74e8d8-e439-4834-9721-08d9e64c7740] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 990.128435] env[68617]: DEBUG nova.compute.manager [None req-645ccf85-ce17-473f-8068-5b0c95db8022 tempest-ServersTestBootFromVolume-1306830610 tempest-ServersTestBootFromVolume-1306830610-project-member] [instance: fe74e8d8-e439-4834-9721-08d9e64c7740] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 990.153348] env[68617]: DEBUG oslo_concurrency.lockutils [None req-645ccf85-ce17-473f-8068-5b0c95db8022 tempest-ServersTestBootFromVolume-1306830610 tempest-ServersTestBootFromVolume-1306830610-project-member] Lock "fe74e8d8-e439-4834-9721-08d9e64c7740" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 213.984s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 990.162525] env[68617]: DEBUG nova.compute.manager [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Starting instance...
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 990.219193] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 990.219479] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 990.221136] env[68617]: INFO nova.compute.claims [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 990.649021] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24871c3b-20ea-463e-94b4-18e270382e4a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 990.657315] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd4027e3-0f2e-495d-b9da-575f08f2682e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 990.687769] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0273505-d3c4-463a-9fb2-af4ce2c94a9e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 990.695093] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad255c0a-3e27-4ed8-9023-b985ef8c345b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 990.708483] env[68617]: DEBUG nova.compute.provider_tree [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 990.719330] env[68617]: DEBUG nova.scheduler.client.report [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 990.735937] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 
tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.516s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 990.736462] env[68617]: DEBUG nova.compute.manager [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 990.773396] env[68617]: DEBUG nova.compute.utils [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 990.774748] env[68617]: DEBUG nova.compute.manager [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Not allocating networking since 'none' was specified. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 990.787647] env[68617]: DEBUG nova.compute.manager [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 990.858840] env[68617]: DEBUG nova.compute.manager [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 990.896214] env[68617]: DEBUG nova.virt.hardware [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=<?>,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-17T05:31:26Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 990.896812] env[68617]: DEBUG nova.virt.hardware [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 990.897202] env[68617]: DEBUG nova.virt.hardware [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 990.897610] env[68617]: DEBUG nova.virt.hardware [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 990.898654] env[68617]: DEBUG nova.virt.hardware [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 990.898654] env[68617]: DEBUG nova.virt.hardware [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 990.898654] env[68617]: DEBUG nova.virt.hardware [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 990.898654] env[68617]: DEBUG nova.virt.hardware [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 990.899258] env[68617]: DEBUG nova.virt.hardware [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 
tempest-ServerShowV257Test-583112352-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 990.899615] env[68617]: DEBUG nova.virt.hardware [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 990.899711] env[68617]: DEBUG nova.virt.hardware [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 990.902281] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68afeb06-5b1d-4edd-948d-80e572167358 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 990.906454] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0fadadf0-5065-4728-8fe2-813e0fbea4cf tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Acquiring lock "82864ac3-a199-478c-8c57-97ea0a256201" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 990.912491] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fa3727b-3b82-42ad-bbf6-d53fbd451cc0 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 990.930661] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Instance VIF info [] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 990.936696] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Creating folder: Project (61d32fa492664e7ab91a1cd1cebf7ca8). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 990.937209] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-14cfff82-e600-40c6-8fca-edc06ff33f89 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 990.946879] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Created folder: Project (61d32fa492664e7ab91a1cd1cebf7ca8) in parent group-v693691. [ 990.947076] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Creating folder: Instances. Parent ref: group-v693737. 
{{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 990.947289] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-67e5d740-6da7-4ccf-a602-d81132c101e1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 990.955849] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Created folder: Instances in parent group-v693737. [ 990.956092] env[68617]: DEBUG oslo.service.loopingcall [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 990.956273] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 990.956464] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-39bb43bb-3af4-4165-9075-fc402914c718 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 990.972845] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 990.972845] env[68617]: value = "task-3470756" [ 990.972845] env[68617]: _type = "Task" [ 990.972845] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 990.980693] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470756, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 991.483723] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470756, 'name': CreateVM_Task, 'duration_secs': 0.259461} completed successfully. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 991.483723] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 991.484205] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 991.484421] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 991.484788] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 991.485106] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bd938246-90e1-4d50-970d-e807bcae4ade {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 991.489977] env[68617]: DEBUG oslo_vmware.api [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Waiting for the task: (returnval){ [ 991.489977] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52b4ea24-4ada-032d-fe00-591d78861de6" [ 991.489977] env[68617]: _type = "Task" [ 991.489977] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 991.499177] env[68617]: DEBUG oslo_vmware.api [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52b4ea24-4ada-032d-fe00-591d78861de6, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 992.001516] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 992.001810] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 992.001987] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1003.699949] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1003.700242] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1003.700298] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1003.721743] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1003.721890] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1003.722034] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1003.722166] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Skipping network cache update for instance because it is Building. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1003.722291] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1003.722419] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1003.722570] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1003.722700] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1003.722820] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1003.722938] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1003.723071] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1004.699815] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1004.727861] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1004.728181] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1005.698852] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1005.699155] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1005.699329] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1005.699476] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1006.699081] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1007.699236] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1007.699553] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1007.710710] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1007.710959] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1007.711163] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1007.711320] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1007.712747] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41c370b0-0dd1-49ed-9a8e-7155fc00ef7d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1007.721414] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc61b7b3-8557-43dc-9596-b1fe16ceee26 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1007.737305] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-298f5ba6-7344-44d0-9abb-2e104071909a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1007.743697] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5350e8ac-ebf3-427d-a6ed-11b49f759996 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1007.773684] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] 
Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180945MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1007.773684] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1007.773684] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1007.850958] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 050e2b27-1311-4a9a-b5cf-6bc2f7128eba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1007.851149] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f13242a0-7e65-4d68-a317-16fb8c4b8f8a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1007.851273] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6300077d-5aa7-4794-8ba2-1ec30151c15c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1007.851391] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 4ea5887f-84bd-4629-b568-e73c78af0ad4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1007.851506] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6eef6e24-cf49-458b-ae37-8da4e02045f8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1007.851623] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 71b1ebba-2019-4378-9bd2-98a7559c22e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1007.851739] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e6b6cbdd-11d6-44a6-8da7-98e0f52cef67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1007.851854] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b27ace75-e2fa-4acc-96cb-88dd49b89de5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1007.851966] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 995585f5-57a4-4ba6-9e28-18a086af264c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1007.852091] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 82864ac3-a199-478c-8c57-97ea0a256201 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1007.864312] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1007.882845] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 39f1e776-4df9-4b24-a51b-c1a15a943a76 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1007.894233] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 79c92a1b-20ef-4360-93b4-913cbfcf92fe has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1007.908013] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 40de8cd1-1c46-4ffb-866b-255386fe44b6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1007.922713] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 1cc76382-5452-4ed4-bb99-c6800c70d42a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1007.936013] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 3258d1a5-7142-4e06-814d-e68fd90262ae has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1007.947753] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e3a2fb7d-b092-485f-b64a-486c458ba845 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1007.958306] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance eaeae56d-8e71-43bc-8441-49a29c161763 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1007.970155] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 79d8a532-b071-4c79-8c5d-f08438928201 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1007.981172] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance c9e6a9e1-6479-47ba-ae12-0441d2761bb6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1007.991693] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 96bc8135-1233-4569-99ce-c7a529b96d11 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1008.002643] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f560b4df-fb57-4f7b-8a8b-53325970e06e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1008.013695] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e4ac9902-3e8b-4790-a00b-2fd45f16ff63 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1008.024446] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 85bfa486-9f65-40d6-a392-54fdf87da1a1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1008.039019] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f63c673e-40dc-49d3-b356-85629ada1101 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1008.048910] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance d3b6336e-4baa-426e-a31d-9788cd2131a0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1008.058932] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 21c0de14-cb70-4a41-954f-aaa904d1514a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1008.069143] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b5a088c8-429a-49b3-b330-315d15ace97f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1008.069383] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1008.069528] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1856MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1008.392222] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16d93bb9-f691-48d1-b3fe-651739e1179f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1008.400036] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3cb17498-e764-4df4-81b5-304bfa575915 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1008.433126] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60526149-fa8a-4918-8252-438672534ef1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1008.440413] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f2456eb-abfc-40e6-b835-4b5837ecb6f2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1008.453399] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1008.462301] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1008.476819] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1008.477058] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.703s {{(pid=68617) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1037.230382] env[68617]: WARNING oslo_vmware.rw_handles [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1037.230382] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1037.230382] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1037.230382] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1037.230382] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1037.230382] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 1037.230382] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1037.230382] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1037.230382] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1037.230382] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1037.230382] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1037.230382] env[68617]: ERROR oslo_vmware.rw_handles [ 1037.230382] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/14280aec-a9c8-4af7-99a2-0bce35755808/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1037.232687] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1037.232929] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Copying Virtual Disk [datastore2] vmware_temp/14280aec-a9c8-4af7-99a2-0bce35755808/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/14280aec-a9c8-4af7-99a2-0bce35755808/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1037.233243] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-dd68b6fc-969a-4766-ab2f-1bbba7edf120 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1037.241920] env[68617]: DEBUG oslo_vmware.api [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 
tempest-ListServerFiltersTestJSON-136232528-project-member] Waiting for the task: (returnval){ [ 1037.241920] env[68617]: value = "task-3470757" [ 1037.241920] env[68617]: _type = "Task" [ 1037.241920] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1037.249929] env[68617]: DEBUG oslo_vmware.api [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Task: {'id': task-3470757, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1037.753093] env[68617]: DEBUG oslo_vmware.exceptions [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Fault InvalidArgument not matched. {{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1037.753396] env[68617]: DEBUG oslo_concurrency.lockutils [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1037.753948] env[68617]: ERROR nova.compute.manager [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1037.753948] env[68617]: Faults: ['InvalidArgument'] [ 1037.753948] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Traceback (most recent call last): [ 1037.753948] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1037.753948] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] yield resources [ 1037.753948] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1037.753948] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] self.driver.spawn(context, instance, image_meta, [ 1037.753948] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1037.753948] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1037.753948] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1037.753948] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] self._fetch_image_if_missing(context, vi) [ 1037.753948] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in 
_fetch_image_if_missing [ 1037.754371] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] image_cache(vi, tmp_image_ds_loc) [ 1037.754371] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1037.754371] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] vm_util.copy_virtual_disk( [ 1037.754371] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1037.754371] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] session._wait_for_task(vmdk_copy_task) [ 1037.754371] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1037.754371] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] return self.wait_for_task(task_ref) [ 1037.754371] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1037.754371] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] return evt.wait() [ 1037.754371] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1037.754371] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] result = hub.switch() [ 1037.754371] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1037.754371] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] return self.greenlet.switch() [ 1037.754795] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1037.754795] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] self.f(*self.args, **self.kw) [ 1037.754795] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1037.754795] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] raise exceptions.translate_fault(task_info.error) [ 1037.754795] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1037.754795] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Faults: ['InvalidArgument'] [ 1037.754795] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] [ 1037.754795] env[68617]: INFO nova.compute.manager [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Terminating instance [ 1037.755904] env[68617]: 
DEBUG oslo_concurrency.lockutils [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1037.756012] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1037.757025] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-dea5c275-ed74-4902-8f2e-25b3952606a2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1037.758459] env[68617]: DEBUG nova.compute.manager [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1037.758642] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1037.759363] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0416577-8593-4f3a-b0d1-7e8ace7570b8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1037.766124] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1037.766367] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-44fa73fb-249d-4e51-b576-f9104f118c26 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1037.768576] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1037.768742] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1037.769699] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d465f86a-a129-4879-acd4-7b655569e11e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1037.774115] env[68617]: DEBUG oslo_vmware.api [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Waiting for the task: (returnval){ [ 1037.774115] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52c98ef3-96a9-853a-872c-a2929b7b32c5" [ 1037.774115] env[68617]: _type = "Task" [ 1037.774115] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1037.783483] env[68617]: DEBUG oslo_vmware.api [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52c98ef3-96a9-853a-872c-a2929b7b32c5, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1037.833066] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1037.833345] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1037.833559] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Deleting the datastore file [datastore2] f13242a0-7e65-4d68-a317-16fb8c4b8f8a {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1037.833944] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9d040252-b95f-45d6-b07c-f7a9ee1e9466 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1037.841874] env[68617]: DEBUG oslo_vmware.api [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Waiting for the task: (returnval){ [ 1037.841874] env[68617]: value = "task-3470759" [ 1037.841874] env[68617]: _type = "Task" [ 1037.841874] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1037.847954] env[68617]: DEBUG oslo_vmware.api [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Task: {'id': task-3470759, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1038.284960] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1038.285292] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Creating directory with path [datastore2] vmware_temp/ac79d948-c286-43f9-8e22-a7b70c09c16e/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1038.285481] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2026badf-a48d-4cda-ab1a-ded7c3efe860 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1038.297023] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Created directory with path [datastore2] vmware_temp/ac79d948-c286-43f9-8e22-a7b70c09c16e/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1038.297361] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Fetch image to [datastore2] vmware_temp/ac79d948-c286-43f9-8e22-a7b70c09c16e/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1038.297546] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/ac79d948-c286-43f9-8e22-a7b70c09c16e/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1038.298262] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe676b60-2a6b-4b46-8c34-c38415fb5145 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1038.304831] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90862292-a774-4330-be24-e7e4e4743222 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1038.314230] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc075f97-48c6-4935-bf36-012426e57edf {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1038.350192] env[68617]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d274c339-63ef-4b14-83ce-3a4b5ff171a3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1038.359054] env[68617]: DEBUG oslo_vmware.api [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Task: {'id': task-3470759, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075319} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1038.360811] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1038.361051] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1038.361273] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1038.361487] env[68617]: INFO nova.compute.manager [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Took 0.60 seconds to destroy the instance on the hypervisor. 
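Editor's note: the records above show a complete vCenter task lifecycle for this teardown: FileManager.DeleteDatastoreFile_Task is invoked, task-3470759 is polled at 0%, and completion is reported after 0.075s; the earlier copy_virtual_disk failure surfaced as a VimFaultException raised out of that same polling loop. A minimal sketch of the pattern with oslo.vmware follows; the endpoint, credentials, and datastore path are placeholders, not values from this log.

    # Sketch: invoke a vCenter task and wait on it with oslo.vmware, the
    # same invoke/poll cycle visible in the records above. Endpoint,
    # credentials and the datastore path are hypothetical placeholders.
    from oslo_vmware import api
    from oslo_vmware import exceptions as vexc

    session = api.VMwareAPISession(
        'vc.example.test', 'user', 'secret',      # placeholder vCenter/creds
        api_retry_count=3, task_poll_interval=0.5)

    vim = session.vim
    dc_ref = None  # in practice, a Datacenter managed object reference

    task = session.invoke_api(
        vim, 'DeleteDatastoreFile_Task',
        vim.service_content.fileManager,
        name='[datastore2] vmware_temp/example-dir',  # hypothetical path
        datacenter=dc_ref)
    try:
        # Polls task state server-side until success or error -- the
        # _poll_task loop that produced the "progress is 0%" lines above.
        session.wait_for_task(task)
    except vexc.VimFaultException as e:
        # Task errors are translated into VimFaultException; the failed
        # copy above carried e.fault_list == ['InvalidArgument'].
        print(e.fault_list, e)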
[ 1038.363333] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5dfe9a0e-4a78-4837-a717-a7297adc5644 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1038.365367] env[68617]: DEBUG nova.compute.claims [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1038.365542] env[68617]: DEBUG oslo_concurrency.lockutils [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1038.365750] env[68617]: DEBUG oslo_concurrency.lockutils [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1038.397768] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1038.457340] env[68617]: DEBUG oslo_vmware.rw_handles [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ac79d948-c286-43f9-8e22-a7b70c09c16e/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1038.516669] env[68617]: DEBUG oslo_vmware.rw_handles [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1038.516669] env[68617]: DEBUG oslo_vmware.rw_handles [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ac79d948-c286-43f9-8e22-a7b70c09c16e/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1038.804827] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cccf9c54-373c-4c2e-8bf2-b0b3260612f5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1038.812797] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3882813-60f9-4de0-85c9-dc076ff8ad28 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1038.841817] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a583509a-86de-421a-ac26-5e8df58ed05d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1038.850220] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6769a180-4d98-4072-801e-e0c9a38ff74c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1038.863138] env[68617]: DEBUG nova.compute.provider_tree [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1038.871450] env[68617]: DEBUG nova.scheduler.client.report [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1038.886419] env[68617]: DEBUG oslo_concurrency.lockutils [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.521s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1038.886988] env[68617]: ERROR nova.compute.manager [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1038.886988] env[68617]: Faults: ['InvalidArgument'] [ 1038.886988] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Traceback (most recent call last): [ 1038.886988] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1038.886988] 
env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] self.driver.spawn(context, instance, image_meta, [ 1038.886988] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1038.886988] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1038.886988] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1038.886988] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] self._fetch_image_if_missing(context, vi) [ 1038.886988] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1038.886988] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] image_cache(vi, tmp_image_ds_loc) [ 1038.886988] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1038.887550] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] vm_util.copy_virtual_disk( [ 1038.887550] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1038.887550] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] session._wait_for_task(vmdk_copy_task) [ 1038.887550] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1038.887550] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] return self.wait_for_task(task_ref) [ 1038.887550] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1038.887550] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] return evt.wait() [ 1038.887550] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1038.887550] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] result = hub.switch() [ 1038.887550] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1038.887550] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] return self.greenlet.switch() [ 1038.887550] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1038.887550] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] self.f(*self.args, **self.kw) [ 1038.888111] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1038.888111] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] raise exceptions.translate_fault(task_info.error) [ 1038.888111] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1038.888111] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Faults: ['InvalidArgument'] [ 1038.888111] env[68617]: ERROR nova.compute.manager [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] [ 1038.888111] env[68617]: DEBUG nova.compute.utils [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1038.889090] env[68617]: DEBUG nova.compute.manager [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Build of instance f13242a0-7e65-4d68-a317-16fb8c4b8f8a was re-scheduled: A specified parameter was not correct: fileType [ 1038.889090] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1038.889479] env[68617]: DEBUG nova.compute.manager [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1038.889649] env[68617]: DEBUG nova.compute.manager [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1038.889814] env[68617]: DEBUG nova.compute.manager [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1038.889973] env[68617]: DEBUG nova.network.neutron [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1039.222366] env[68617]: DEBUG nova.network.neutron [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1039.234215] env[68617]: INFO nova.compute.manager [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: f13242a0-7e65-4d68-a317-16fb8c4b8f8a] Took 0.34 seconds to deallocate network for instance. [ 1039.327791] env[68617]: INFO nova.scheduler.client.report [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Deleted allocations for instance f13242a0-7e65-4d68-a317-16fb8c4b8f8a [ 1039.347444] env[68617]: DEBUG oslo_concurrency.lockutils [None req-434342dc-c906-44dc-859f-ce230cf71873 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Lock "f13242a0-7e65-4d68-a317-16fb8c4b8f8a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 338.750s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1039.363631] env[68617]: DEBUG nova.compute.manager [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Starting instance... 
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1039.420439] env[68617]: DEBUG oslo_concurrency.lockutils [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1039.420693] env[68617]: DEBUG oslo_concurrency.lockutils [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1039.422163] env[68617]: INFO nova.compute.claims [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1039.781478] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79ea9d51-2434-4e55-bf77-2f04034cfa91 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1039.789194] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fdb8dec-a870-4012-8050-631e366f34d9 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1039.818258] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-094a125d-0845-4c71-93bd-16373cafeb32 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1039.825392] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b75ec5c-3b24-4f67-9f22-fc288a402d76 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1039.840061] env[68617]: DEBUG nova.compute.provider_tree [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1039.848615] env[68617]: DEBUG nova.scheduler.client.report [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1039.863403] env[68617]: DEBUG oslo_concurrency.lockutils [None 
req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.443s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1039.863877] env[68617]: DEBUG nova.compute.manager [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1039.896461] env[68617]: DEBUG nova.compute.utils [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1039.897936] env[68617]: DEBUG nova.compute.manager [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1039.898119] env[68617]: DEBUG nova.network.neutron [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1039.906404] env[68617]: DEBUG nova.compute.manager [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1039.958520] env[68617]: DEBUG nova.policy [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b0a8591289ef42a5ac552b78056a9e2f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2f6a22998f8246a9b1bc32e095da5913', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 1039.974371] env[68617]: DEBUG nova.compute.manager [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1040.000024] env[68617]: DEBUG nova.virt.hardware [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1040.000024] env[68617]: DEBUG nova.virt.hardware [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1040.000024] env[68617]: DEBUG nova.virt.hardware [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1040.000242] env[68617]: DEBUG nova.virt.hardware [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1040.000242] env[68617]: DEBUG nova.virt.hardware [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1040.000242] env[68617]: DEBUG nova.virt.hardware [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1040.000460] env[68617]: DEBUG nova.virt.hardware [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1040.000743] env[68617]: DEBUG nova.virt.hardware [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1040.001060] env[68617]: DEBUG nova.virt.hardware [None 
req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1040.001343] env[68617]: DEBUG nova.virt.hardware [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1040.001648] env[68617]: DEBUG nova.virt.hardware [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1040.002662] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10dd7a54-ff83-44a6-9288-dae119d5a84a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1040.012842] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe3f30cc-a05f-4af0-aad9-49e000f5e656 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1040.269620] env[68617]: DEBUG nova.network.neutron [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Successfully created port: 62a52506-3eed-42b5-ac21-0775f5a6234b {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1040.908421] env[68617]: DEBUG nova.network.neutron [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Successfully updated port: 62a52506-3eed-42b5-ac21-0775f5a6234b {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1040.919182] env[68617]: DEBUG oslo_concurrency.lockutils [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Acquiring lock "refresh_cache-dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1040.920355] env[68617]: DEBUG oslo_concurrency.lockutils [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Acquired lock "refresh_cache-dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1040.920355] env[68617]: DEBUG nova.network.neutron [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1040.964289] env[68617]: DEBUG nova.network.neutron [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 
tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1041.364811] env[68617]: DEBUG nova.network.neutron [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Updating instance_info_cache with network_info: [{"id": "62a52506-3eed-42b5-ac21-0775f5a6234b", "address": "fa:16:3e:3f:bc:f8", "network": {"id": "02b903e8-24c7-4b84-9a8e-8cf3b18abc74", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2041230651-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2f6a22998f8246a9b1bc32e095da5913", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6815237d-f565-474d-a3c0-9c675478eb00", "external-id": "nsx-vlan-transportzone-526", "segmentation_id": 526, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap62a52506-3e", "ovs_interfaceid": "62a52506-3eed-42b5-ac21-0775f5a6234b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1041.382363] env[68617]: DEBUG oslo_concurrency.lockutils [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Releasing lock "refresh_cache-dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1041.382363] env[68617]: DEBUG nova.compute.manager [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Instance network_info: |[{"id": "62a52506-3eed-42b5-ac21-0775f5a6234b", "address": "fa:16:3e:3f:bc:f8", "network": {"id": "02b903e8-24c7-4b84-9a8e-8cf3b18abc74", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2041230651-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2f6a22998f8246a9b1bc32e095da5913", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6815237d-f565-474d-a3c0-9c675478eb00", "external-id": "nsx-vlan-transportzone-526", "segmentation_id": 526, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap62a52506-3e", "ovs_interfaceid": "62a52506-3eed-42b5-ac21-0775f5a6234b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": 
{}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1041.382584] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:3f:bc:f8', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6815237d-f565-474d-a3c0-9c675478eb00', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '62a52506-3eed-42b5-ac21-0775f5a6234b', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1041.389616] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Creating folder: Project (2f6a22998f8246a9b1bc32e095da5913). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1041.391812] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d1170efc-6400-4e26-8549-e2d51bfd795f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1041.403575] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Created folder: Project (2f6a22998f8246a9b1bc32e095da5913) in parent group-v693691. [ 1041.403764] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Creating folder: Instances. Parent ref: group-v693740. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1041.403986] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d6350d7f-4b8b-4480-8afa-27e7d8f0e23d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1041.412164] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Created folder: Instances in parent group-v693740. [ 1041.412404] env[68617]: DEBUG oslo.service.loopingcall [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1041.412553] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1041.412739] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-68676183-66b4-4d1f-bedc-192eb018b44d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1041.430771] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1041.430771] env[68617]: value = "task-3470762" [ 1041.430771] env[68617]: _type = "Task" [ 1041.430771] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1041.438413] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470762, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1041.688359] env[68617]: DEBUG nova.compute.manager [req-24d300b7-48cc-411b-8c81-7f26da20c001 req-c825c297-d45b-4d6c-a119-588116aa2278 service nova] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Received event network-vif-plugged-62a52506-3eed-42b5-ac21-0775f5a6234b {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1041.688679] env[68617]: DEBUG oslo_concurrency.lockutils [req-24d300b7-48cc-411b-8c81-7f26da20c001 req-c825c297-d45b-4d6c-a119-588116aa2278 service nova] Acquiring lock "dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1041.688935] env[68617]: DEBUG oslo_concurrency.lockutils [req-24d300b7-48cc-411b-8c81-7f26da20c001 req-c825c297-d45b-4d6c-a119-588116aa2278 service nova] Lock "dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1041.689211] env[68617]: DEBUG oslo_concurrency.lockutils [req-24d300b7-48cc-411b-8c81-7f26da20c001 req-c825c297-d45b-4d6c-a119-588116aa2278 service nova] Lock "dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1041.689405] env[68617]: DEBUG nova.compute.manager [req-24d300b7-48cc-411b-8c81-7f26da20c001 req-c825c297-d45b-4d6c-a119-588116aa2278 service nova] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] No waiting events found dispatching network-vif-plugged-62a52506-3eed-42b5-ac21-0775f5a6234b {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1041.689721] env[68617]: WARNING nova.compute.manager [req-24d300b7-48cc-411b-8c81-7f26da20c001 req-c825c297-d45b-4d6c-a119-588116aa2278 service nova] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Received unexpected event network-vif-plugged-62a52506-3eed-42b5-ac21-0775f5a6234b for instance with vm_state building and task_state spawning. 
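Editor's note: the "Acquiring lock ... by ...", "acquired ... waited 0.000s" and "released ... held 0.000s" lines that bracket the event dispatch above are emitted by oslo.concurrency's lock helpers. A minimal sketch of the two usages seen throughout this log, with illustrative lock names that are not Nova's:

    # Sketch of the oslo.concurrency locking behind the lockutils records
    # above. Lock names and the function body are illustrative only.
    from oslo_concurrency import lockutils

    @lockutils.synchronized('example-instance-events')  # hypothetical name
    def pop_event(events, key):
        # Runs with the named lock held; the decorator's inner wrapper logs
        # the "acquired ... waited" / "released ... held" lines seen above.
        return events.pop(key, None)

    # The context-manager form produces the "Acquiring lock" / "Acquired
    # lock" / "Releasing lock" lines used around the cache refresh above.
    with lockutils.lock('refresh_cache-example'):       # hypothetical name
        pass  # critical section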
[ 1041.689968] env[68617]: DEBUG nova.compute.manager [req-24d300b7-48cc-411b-8c81-7f26da20c001 req-c825c297-d45b-4d6c-a119-588116aa2278 service nova] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Received event network-changed-62a52506-3eed-42b5-ac21-0775f5a6234b {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1041.690193] env[68617]: DEBUG nova.compute.manager [req-24d300b7-48cc-411b-8c81-7f26da20c001 req-c825c297-d45b-4d6c-a119-588116aa2278 service nova] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Refreshing instance network info cache due to event network-changed-62a52506-3eed-42b5-ac21-0775f5a6234b. {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1041.690427] env[68617]: DEBUG oslo_concurrency.lockutils [req-24d300b7-48cc-411b-8c81-7f26da20c001 req-c825c297-d45b-4d6c-a119-588116aa2278 service nova] Acquiring lock "refresh_cache-dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1041.690599] env[68617]: DEBUG oslo_concurrency.lockutils [req-24d300b7-48cc-411b-8c81-7f26da20c001 req-c825c297-d45b-4d6c-a119-588116aa2278 service nova] Acquired lock "refresh_cache-dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1041.690781] env[68617]: DEBUG nova.network.neutron [req-24d300b7-48cc-411b-8c81-7f26da20c001 req-c825c297-d45b-4d6c-a119-588116aa2278 service nova] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Refreshing network info cache for port 62a52506-3eed-42b5-ac21-0775f5a6234b {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1041.921021] env[68617]: DEBUG nova.network.neutron [req-24d300b7-48cc-411b-8c81-7f26da20c001 req-c825c297-d45b-4d6c-a119-588116aa2278 service nova] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Updated VIF entry in instance network info cache for port 62a52506-3eed-42b5-ac21-0775f5a6234b. 
{{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1041.921431] env[68617]: DEBUG nova.network.neutron [req-24d300b7-48cc-411b-8c81-7f26da20c001 req-c825c297-d45b-4d6c-a119-588116aa2278 service nova] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Updating instance_info_cache with network_info: [{"id": "62a52506-3eed-42b5-ac21-0775f5a6234b", "address": "fa:16:3e:3f:bc:f8", "network": {"id": "02b903e8-24c7-4b84-9a8e-8cf3b18abc74", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2041230651-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2f6a22998f8246a9b1bc32e095da5913", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6815237d-f565-474d-a3c0-9c675478eb00", "external-id": "nsx-vlan-transportzone-526", "segmentation_id": 526, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap62a52506-3e", "ovs_interfaceid": "62a52506-3eed-42b5-ac21-0775f5a6234b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1041.930591] env[68617]: DEBUG oslo_concurrency.lockutils [req-24d300b7-48cc-411b-8c81-7f26da20c001 req-c825c297-d45b-4d6c-a119-588116aa2278 service nova] Releasing lock "refresh_cache-dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1041.942784] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470762, 'name': CreateVM_Task, 'duration_secs': 0.302342} completed successfully. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1041.942943] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1041.943525] env[68617]: DEBUG oslo_concurrency.lockutils [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1041.943735] env[68617]: DEBUG oslo_concurrency.lockutils [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1041.943967] env[68617]: DEBUG oslo_concurrency.lockutils [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1041.944209] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ca06122a-bc39-430e-b35e-64de7312c7cf {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1041.948464] env[68617]: DEBUG oslo_vmware.api [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Waiting for the task: (returnval){ [ 1041.948464] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]5209566a-34f0-23de-b7d2-613cda41bbfe" [ 1041.948464] env[68617]: _type = "Task" [ 1041.948464] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1041.956078] env[68617]: DEBUG oslo_vmware.api [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]5209566a-34f0-23de-b7d2-613cda41bbfe, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1042.460253] env[68617]: DEBUG oslo_concurrency.lockutils [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1042.460253] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1042.460253] env[68617]: DEBUG oslo_concurrency.lockutils [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1045.213173] env[68617]: DEBUG oslo_concurrency.lockutils [None req-de0db407-7a80-4611-b802-00eb6401471a tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Acquiring lock "dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1065.476780] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1065.477182] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1065.477182] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1065.499312] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1065.499504] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1065.499717] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Skipping network cache update for instance because it is Building. 
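
The lock sequence above ("[datastore2] devstack-image-cache_base/<image-id>", then the ".vmdk" path) is how concurrent builds from the same image are serialized around the datastore image cache. A hedged sketch of the same pattern with oslo.concurrency (lock name taken from the log; lock_path is a placeholder):

    from oslo_concurrency import lockutils

    IMAGE_LOCK = ("[datastore2] devstack-image-cache_base/"
                  "c87eab51-bc9a-44dc-8f0d-7ab73283e453")

    # lockutils.lock() is a context manager; external=True adds a file
    # lock so other workers on the same host are excluded as well.
    with lockutils.lock(IMAGE_LOCK, external=True, lock_path="/tmp"):
        # the guarded section: check whether the cached VMDK exists,
        # fetch and convert it if missing (_fetch_image_if_missing)
        pass
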
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1065.499899] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1065.500045] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1065.500174] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1065.500297] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1065.500417] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1065.500536] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1065.500660] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1065.500780] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1066.698529] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1066.698871] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1066.698953] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1066.699124] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1066.699273] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1066.699418] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1066.699574] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... 
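
Every "Running periodic task ComputeManager._poll_*" line above comes from one oslo.service mechanism: methods decorated as periodic tasks on a PeriodicTasks subclass, fired by run_periodic_tasks(). A minimal, self-contained sketch (class name and spacing are made up):

    from oslo_config import cfg
    from oslo_service import periodic_task

    class DemoManager(periodic_task.PeriodicTasks):
        def __init__(self, conf):
            super().__init__(conf)

        # Registered at class creation; the runner decides per call
        # whether enough time has elapsed to fire it again.
        @periodic_task.periodic_task(spacing=60)
        def _poll_something(self, context):
            print("periodic task ran")

    conf = cfg.ConfigOpts()
    DemoManager(conf).run_periodic_tasks(context=None)
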
{{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1067.699622] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1069.700044] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1069.710894] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1069.711135] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1069.711378] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1069.711553] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1069.712673] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bf60896-3814-42a1-ac92-0dd4aff601f1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1069.721797] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f5ba9e2-b664-4aec-be84-a28c606685dc {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1069.735522] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c03faf9-eb32-4faa-b220-409e98391521 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1069.742139] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91125c52-dada-4225-9e0b-3d6b2a853363 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1069.772570] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180940MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1069.772721] env[68617]: DEBUG 
oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1069.772911] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1069.847193] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 050e2b27-1311-4a9a-b5cf-6bc2f7128eba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1069.847354] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6300077d-5aa7-4794-8ba2-1ec30151c15c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1069.847482] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 4ea5887f-84bd-4629-b568-e73c78af0ad4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1069.847736] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6eef6e24-cf49-458b-ae37-8da4e02045f8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1069.847900] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 71b1ebba-2019-4378-9bd2-98a7559c22e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1069.848041] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e6b6cbdd-11d6-44a6-8da7-98e0f52cef67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1069.848162] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b27ace75-e2fa-4acc-96cb-88dd49b89de5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1069.848336] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 995585f5-57a4-4ba6-9e28-18a086af264c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1069.848482] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 82864ac3-a199-478c-8c57-97ea0a256201 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1069.848602] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1069.862699] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 79c92a1b-20ef-4360-93b4-913cbfcf92fe has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1069.873232] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 40de8cd1-1c46-4ffb-866b-255386fe44b6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1069.884611] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 1cc76382-5452-4ed4-bb99-c6800c70d42a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1069.894258] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 3258d1a5-7142-4e06-814d-e68fd90262ae has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1069.920389] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e3a2fb7d-b092-485f-b64a-486c458ba845 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1069.930711] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance eaeae56d-8e71-43bc-8441-49a29c161763 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1069.943947] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 79d8a532-b071-4c79-8c5d-f08438928201 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1069.954364] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance c9e6a9e1-6479-47ba-ae12-0441d2761bb6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1069.964250] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 96bc8135-1233-4569-99ce-c7a529b96d11 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1069.994783] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f560b4df-fb57-4f7b-8a8b-53325970e06e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1070.005218] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e4ac9902-3e8b-4790-a00b-2fd45f16ff63 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1070.015624] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 85bfa486-9f65-40d6-a392-54fdf87da1a1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1070.024903] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f63c673e-40dc-49d3-b356-85629ada1101 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1070.035067] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance d3b6336e-4baa-426e-a31d-9788cd2131a0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1070.044099] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 21c0de14-cb70-4a41-954f-aaa904d1514a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1070.054308] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b5a088c8-429a-49b3-b330-315d15ace97f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
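
The "Final resource view" reported just below is straightforward arithmetic over the allocations listed above: ten actively managed instances, nine holding 128 MB and one holding 192 MB, each with one VCPU and 1 GB of disk, plus the 512 MB of RAM the inventory reserves. A quick check:

    # Per-instance placement allocations from the log above.
    allocs = ([{"MEMORY_MB": 128, "VCPU": 1, "DISK_GB": 1}] * 9
              + [{"MEMORY_MB": 192, "VCPU": 1, "DISK_GB": 1}])

    used_ram = sum(a["MEMORY_MB"] for a in allocs) + 512  # reserved RAM
    used_vcpus = sum(a["VCPU"] for a in allocs)
    used_disk = sum(a["DISK_GB"] for a in allocs)

    # Matches used_ram=1856MB, used_vcpus=10, used_disk=10GB below;
    # usable VCPUs are likewise 48 physical * allocation_ratio 4.0 = 192.
    assert (used_ram, used_vcpus, used_disk) == (1856, 10, 10)
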
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1070.054542] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1070.054694] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1856MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1070.314122] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9dcc856e-fbbb-4afa-8c3f-78ec0a870e70 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1070.322062] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-208459d5-a6b8-42f9-90f5-902bb4242be6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1070.353032] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce5752f4-bd7e-41d4-a6b9-af68cc43fd7f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1070.359388] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-719adf64-38ae-4d8a-aceb-52a646623848 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1070.372081] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1070.381389] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1070.395562] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1070.395743] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.623s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1087.742607] env[68617]: WARNING oslo_vmware.rw_handles [None 
req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1087.742607] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1087.742607] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1087.742607] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1087.742607] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1087.742607] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 1087.742607] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1087.742607] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1087.742607] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1087.742607] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1087.742607] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1087.742607] env[68617]: ERROR oslo_vmware.rw_handles [ 1087.743131] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/ac79d948-c286-43f9-8e22-a7b70c09c16e/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1087.745349] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1087.745604] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Copying Virtual Disk [datastore2] vmware_temp/ac79d948-c286-43f9-8e22-a7b70c09c16e/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/ac79d948-c286-43f9-8e22-a7b70c09c16e/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1087.745926] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5026a392-f2af-459c-9fd4-79f355769727 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1087.754510] env[68617]: DEBUG oslo_vmware.api [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Waiting for the task: (returnval){ [ 1087.754510] env[68617]: value = "task-3470763" [ 1087.754510] 
env[68617]: _type = "Task" [ 1087.754510] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1087.762603] env[68617]: DEBUG oslo_vmware.api [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Task: {'id': task-3470763, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1088.264426] env[68617]: DEBUG oslo_vmware.exceptions [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Fault InvalidArgument not matched. {{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1088.264738] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1088.265276] env[68617]: ERROR nova.compute.manager [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1088.265276] env[68617]: Faults: ['InvalidArgument'] [ 1088.265276] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Traceback (most recent call last): [ 1088.265276] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1088.265276] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] yield resources [ 1088.265276] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1088.265276] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] self.driver.spawn(context, instance, image_meta, [ 1088.265276] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1088.265276] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1088.265276] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1088.265276] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] self._fetch_image_if_missing(context, vi) [ 1088.265276] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1088.265536] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] 
image_cache(vi, tmp_image_ds_loc) [ 1088.265536] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1088.265536] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] vm_util.copy_virtual_disk( [ 1088.265536] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1088.265536] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] session._wait_for_task(vmdk_copy_task) [ 1088.265536] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1088.265536] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] return self.wait_for_task(task_ref) [ 1088.265536] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1088.265536] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] return evt.wait() [ 1088.265536] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1088.265536] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] result = hub.switch() [ 1088.265536] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1088.265536] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] return self.greenlet.switch() [ 1088.265832] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1088.265832] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] self.f(*self.args, **self.kw) [ 1088.265832] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1088.265832] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] raise exceptions.translate_fault(task_info.error) [ 1088.265832] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1088.265832] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Faults: ['InvalidArgument'] [ 1088.265832] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] [ 1088.265832] env[68617]: INFO nova.compute.manager [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Terminating instance [ 1088.267606] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 
tempest-ServerDiagnosticsTest-773527931-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1088.268365] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1088.268486] env[68617]: DEBUG nova.compute.manager [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1088.268680] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1088.268894] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b621c4e2-577f-4262-82c0-1612d1ab3e54 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1088.271426] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8e06bb1-0486-4d9a-ae4c-b2bcb6d389d3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1088.277967] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1088.278275] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a3c94dce-5c60-4bc6-b09f-137184976166 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1088.280693] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1088.280867] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1088.281835] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-433950ae-77cf-4789-bb22-f0918043d653 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1088.286561] env[68617]: DEBUG oslo_vmware.api [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Waiting for the task: (returnval){ [ 1088.286561] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52d7ee60-bf9b-71ce-e1c2-77bbbf1ce406" [ 1088.286561] env[68617]: _type = "Task" [ 1088.286561] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1088.293374] env[68617]: DEBUG oslo_vmware.api [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52d7ee60-bf9b-71ce-e1c2-77bbbf1ce406, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1088.349064] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1088.349064] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1088.349064] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Deleting the datastore file [datastore2] 050e2b27-1311-4a9a-b5cf-6bc2f7128eba {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1088.349064] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-af835b2e-9bae-4504-a9e9-b6f62ebc4d8f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1088.354602] env[68617]: DEBUG oslo_vmware.api [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Waiting for the task: (returnval){ [ 1088.354602] env[68617]: value = "task-3470765" [ 1088.354602] env[68617]: _type = "Task" [ 1088.354602] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1088.362963] env[68617]: DEBUG oslo_vmware.api [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Task: {'id': task-3470765, 'name': DeleteDatastoreFile_Task} progress is 0%. 
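
Each "Waiting for the task: (returnval){...} to complete" / "progress is 0%" pair above is oslo.vmware's task-polling loop: invoke_api() starts a vCenter task and returns its moref, and wait_for_task() polls it (the _poll_task frames in the log) until it succeeds or faults. A hedged sketch against a reachable vCenter (host, credentials, and the file path are placeholders):

    from oslo_vmware import api

    session = api.VMwareAPISession(
        "vc.example.test", "user", "secret",
        api_retry_count=10, task_poll_interval=0.5)

    task = session.invoke_api(
        session.vim, "DeleteDatastoreFile_Task",
        session.vim.service_content.fileManager,
        name="[datastore2] 050e2b27-1311-4a9a-b5cf-6bc2f7128eba",
        datacenter=None)  # a real call passes the datacenter moref
    session.wait_for_task(task)  # raises translate_fault() on error
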
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1088.796743] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1088.797023] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Creating directory with path [datastore2] vmware_temp/df9ed3ed-2ce7-4abf-a003-2b46558bd40c/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1088.797245] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e0dbed5c-e0bc-4eb6-b010-36d9e1f41ffb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1088.807731] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Created directory with path [datastore2] vmware_temp/df9ed3ed-2ce7-4abf-a003-2b46558bd40c/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1088.807924] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Fetch image to [datastore2] vmware_temp/df9ed3ed-2ce7-4abf-a003-2b46558bd40c/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1088.808110] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/df9ed3ed-2ce7-4abf-a003-2b46558bd40c/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1088.808818] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa63dca0-1137-45e5-ad12-c2813809996f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1088.815353] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7259c2e4-c0ae-4724-9da8-1171a1f3860a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1088.824132] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e09b20f3-dc2e-46c8-8117-af569243564d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1088.853953] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78327158-5865-4779-9c27-72411f115438 {{(pid=68617) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1088.864449] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fc035d89-2610-464e-8337-8c4706ef598a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1088.866076] env[68617]: DEBUG oslo_vmware.api [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Task: {'id': task-3470765, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.06654} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1088.866307] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1088.866492] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1088.866662] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1088.866829] env[68617]: INFO nova.compute.manager [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Took 0.60 seconds to destroy the instance on the hypervisor. 
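
With the hypervisor-side destroy finished, the next lines abort the instance's resource claim so its placement usage is released; that work is serialized on the same "compute_resources" lock the tracker takes everywhere. A sketch of the pattern (the tracker class here is a stand-in, not Nova's):

    from oslo_concurrency import lockutils

    class FakeTracker:
        def __init__(self):
            self.usage = {"MEMORY_MB": 128, "VCPU": 1, "DISK_GB": 1}

        # Mirrors ResourceTracker.abort_instance_claim: one process-wide
        # lock guards all mutations of tracked usage.
        @lockutils.synchronized("compute_resources")
        def abort_instance_claim(self, instance_uuid):
            self.usage = {k: 0 for k in self.usage}

    t = FakeTracker()
    t.abort_instance_claim("050e2b27-1311-4a9a-b5cf-6bc2f7128eba")
    assert t.usage == {"MEMORY_MB": 0, "VCPU": 0, "DISK_GB": 0}
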
[ 1088.868871] env[68617]: DEBUG nova.compute.claims [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1088.869076] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1088.869266] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1088.887465] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1088.940459] env[68617]: DEBUG oslo_vmware.rw_handles [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/df9ed3ed-2ce7-4abf-a003-2b46558bd40c/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1089.000017] env[68617]: DEBUG oslo_vmware.rw_handles [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1089.000216] env[68617]: DEBUG oslo_vmware.rw_handles [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/df9ed3ed-2ce7-4abf-a003-2b46558bd40c/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
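
The write handle above streams the 21,318,656-byte image straight to the ESX host's /folder endpoint; the dcPath and dsName query parameters select the datacenter and datastore. A rough http.client equivalent (host and cookie are placeholders for the session state Nova actually holds):

    import http.client

    SIZE = 21318656
    PATH = ("/folder/vmware_temp/df9ed3ed-2ce7-4abf-a003-2b46558bd40c/"
            "c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk"
            "?dcPath=ha-datacenter&dsName=datastore2")

    conn = http.client.HTTPSConnection("esx.example.test", 443)
    conn.putrequest("PUT", PATH)
    conn.putheader("Content-Length", str(SIZE))
    conn.putheader("Cookie", "<session or service ticket>")
    conn.endheaders()
    # rw_handles then send()s the image iterator chunk by chunk and
    # finally calls getresponse() -- the step where the earlier
    # RemoteDisconnected warning in this log was raised.
    conn.close()
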
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1089.263846] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc1c557a-7173-4809-af91-d73b0637f05f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1089.271424] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46316480-31f9-44c1-b868-c78bc3c9e0ba {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1089.304021] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe1af191-c3b6-4def-8211-f62654f71af7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1089.310980] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbb667c4-4e88-41cc-8a00-aec0d41f577f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1089.323620] env[68617]: DEBUG nova.compute.provider_tree [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1089.335488] env[68617]: DEBUG nova.scheduler.client.report [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1089.351024] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.482s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1089.351511] env[68617]: ERROR nova.compute.manager [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1089.351511] env[68617]: Faults: ['InvalidArgument'] [ 1089.351511] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Traceback (most recent call last): [ 1089.351511] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 1089.351511] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] self.driver.spawn(context, instance, image_meta, [ 1089.351511] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1089.351511] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1089.351511] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1089.351511] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] self._fetch_image_if_missing(context, vi) [ 1089.351511] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1089.351511] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] image_cache(vi, tmp_image_ds_loc) [ 1089.351511] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1089.351784] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] vm_util.copy_virtual_disk( [ 1089.351784] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1089.351784] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] session._wait_for_task(vmdk_copy_task) [ 1089.351784] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1089.351784] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] return self.wait_for_task(task_ref) [ 1089.351784] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1089.351784] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] return evt.wait() [ 1089.351784] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1089.351784] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] result = hub.switch() [ 1089.351784] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1089.351784] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] return self.greenlet.switch() [ 1089.351784] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1089.351784] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] self.f(*self.args, **self.kw) [ 1089.352085] env[68617]: ERROR nova.compute.manager [instance: 
050e2b27-1311-4a9a-b5cf-6bc2f7128eba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1089.352085] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] raise exceptions.translate_fault(task_info.error) [ 1089.352085] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1089.352085] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Faults: ['InvalidArgument'] [ 1089.352085] env[68617]: ERROR nova.compute.manager [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] [ 1089.352255] env[68617]: DEBUG nova.compute.utils [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1089.353663] env[68617]: DEBUG nova.compute.manager [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Build of instance 050e2b27-1311-4a9a-b5cf-6bc2f7128eba was re-scheduled: A specified parameter was not correct: fileType [ 1089.353663] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1089.354040] env[68617]: DEBUG nova.compute.manager [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1089.354220] env[68617]: DEBUG nova.compute.manager [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
[ 1089.354375] env[68617]: DEBUG nova.compute.manager [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1089.354537] env[68617]: DEBUG nova.network.neutron [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1089.666933] env[68617]: DEBUG nova.network.neutron [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1089.678242] env[68617]: INFO nova.compute.manager [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Took 0.32 seconds to deallocate network for instance.
[ 1089.778280] env[68617]: INFO nova.scheduler.client.report [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Deleted allocations for instance 050e2b27-1311-4a9a-b5cf-6bc2f7128eba
[ 1089.801059] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ac3becde-4643-43a1-a4db-698ab7a219c1 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Lock "050e2b27-1311-4a9a-b5cf-6bc2f7128eba" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 389.585s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1089.802668] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f7e12e17-4e6f-4de7-8a04-8e13af42ffe6 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Lock "050e2b27-1311-4a9a-b5cf-6bc2f7128eba" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 187.980s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1089.803263] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f7e12e17-4e6f-4de7-8a04-8e13af42ffe6 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Acquiring lock "050e2b27-1311-4a9a-b5cf-6bc2f7128eba-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1089.803504] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f7e12e17-4e6f-4de7-8a04-8e13af42ffe6 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Lock "050e2b27-1311-4a9a-b5cf-6bc2f7128eba-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1089.803683] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f7e12e17-4e6f-4de7-8a04-8e13af42ffe6 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Lock "050e2b27-1311-4a9a-b5cf-6bc2f7128eba-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1089.805937] env[68617]: INFO nova.compute.manager [None req-f7e12e17-4e6f-4de7-8a04-8e13af42ffe6 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Terminating instance
[ 1089.808030] env[68617]: DEBUG nova.compute.manager [None req-f7e12e17-4e6f-4de7-8a04-8e13af42ffe6 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1089.808278] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f7e12e17-4e6f-4de7-8a04-8e13af42ffe6 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1089.808752] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d52db70e-3cdd-4462-b831-7c740ce306d8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1089.814316] env[68617]: DEBUG nova.compute.manager [None req-232bfc7e-96ff-4f45-9df5-245b28c10087 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: 39f1e776-4df9-4b24-a51b-c1a15a943a76] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1089.820969] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fcedcf2-fb5a-4844-9866-8895471c7bb3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1089.850933] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-f7e12e17-4e6f-4de7-8a04-8e13af42ffe6 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 050e2b27-1311-4a9a-b5cf-6bc2f7128eba could not be found.
[ 1089.851235] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f7e12e17-4e6f-4de7-8a04-8e13af42ffe6 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1089.851643] env[68617]: INFO nova.compute.manager [None req-f7e12e17-4e6f-4de7-8a04-8e13af42ffe6 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1089.851734] env[68617]: DEBUG oslo.service.loopingcall [None req-f7e12e17-4e6f-4de7-8a04-8e13af42ffe6 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1089.852595] env[68617]: DEBUG nova.compute.manager [None req-232bfc7e-96ff-4f45-9df5-245b28c10087 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: 39f1e776-4df9-4b24-a51b-c1a15a943a76] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 1089.854486] env[68617]: DEBUG nova.compute.manager [-] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1089.854486] env[68617]: DEBUG nova.network.neutron [-] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1089.876038] env[68617]: DEBUG oslo_concurrency.lockutils [None req-232bfc7e-96ff-4f45-9df5-245b28c10087 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Lock "39f1e776-4df9-4b24-a51b-c1a15a943a76" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 240.562s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1089.882213] env[68617]: DEBUG nova.network.neutron [-] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1089.885024] env[68617]: DEBUG nova.compute.manager [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1089.890914] env[68617]: INFO nova.compute.manager [-] [instance: 050e2b27-1311-4a9a-b5cf-6bc2f7128eba] Took 0.04 seconds to deallocate network for instance.
[ 1089.936398] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1089.936664] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1089.938314] env[68617]: INFO nova.compute.claims [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1090.001745] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f7e12e17-4e6f-4de7-8a04-8e13af42ffe6 tempest-ServersAdminNegativeTestJSON-1454940715 tempest-ServersAdminNegativeTestJSON-1454940715-project-member] Lock "050e2b27-1311-4a9a-b5cf-6bc2f7128eba" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.199s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1090.263250] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fdd42d84-aa52-4f46-81bd-449fce1e70f5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1090.270745] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-526b62d7-a12c-48db-8039-10a5a0abb868 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1090.303975] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b6b41ce-68bd-47d4-b270-b944fa4798a6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1090.311164] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2639162-eb45-4d4c-a30d-a7c5671dd5cc {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1090.324579] env[68617]: DEBUG nova.compute.provider_tree [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1090.333143] env[68617]: DEBUG nova.scheduler.client.report [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1090.346538] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.410s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1090.347032] env[68617]: DEBUG nova.compute.manager [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 1090.384020] env[68617]: DEBUG nova.compute.utils [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1090.385131] env[68617]: DEBUG nova.compute.manager [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1090.385308] env[68617]: DEBUG nova.network.neutron [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1090.394810] env[68617]: DEBUG nova.compute.manager [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1090.463471] env[68617]: DEBUG nova.compute.manager [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Start spawning the instance on the hypervisor. {{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 1090.467050] env[68617]: DEBUG nova.policy [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '25f775593b3649b884be0bb38c2100b2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '851d03d686cc400c89d6ef7ab051840b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1090.489761] env[68617]: DEBUG nova.virt.hardware [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=<?>,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-17T05:31:26Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1090.489996] env[68617]: DEBUG nova.virt.hardware [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1090.490168] env[68617]: DEBUG nova.virt.hardware [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1090.490349] env[68617]: DEBUG nova.virt.hardware [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1090.490489] env[68617]: DEBUG nova.virt.hardware [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1090.490630] env[68617]: DEBUG nova.virt.hardware [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1090.490833] env[68617]: DEBUG nova.virt.hardware [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1090.491017] env[68617]: DEBUG nova.virt.hardware [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1090.491814] env[68617]: DEBUG nova.virt.hardware [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1090.491997] env[68617]: DEBUG nova.virt.hardware [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1090.492190] env[68617]: DEBUG nova.virt.hardware [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1090.493874] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8cab2f3e-d7da-4d22-bd7a-501b74488365 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1090.502308] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ff04816-3071-4b88-bbd3-5801cbea710f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1090.832241] env[68617]: DEBUG nova.network.neutron [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Successfully created port: 86fe5703-6b39-4629-9a20-387fbb55f31c {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1091.404694] env[68617]: DEBUG nova.network.neutron [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Successfully updated port: 86fe5703-6b39-4629-9a20-387fbb55f31c {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1091.424671] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Acquiring lock "refresh_cache-79c92a1b-20ef-4360-93b4-913cbfcf92fe" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1091.424671] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Acquired lock "refresh_cache-79c92a1b-20ef-4360-93b4-913cbfcf92fe" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1091.424671] env[68617]: DEBUG nova.network.neutron [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1091.473693] env[68617]: DEBUG nova.network.neutron [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1091.658187] env[68617]: DEBUG nova.network.neutron [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Updating instance_info_cache with network_info: [{"id": "86fe5703-6b39-4629-9a20-387fbb55f31c", "address": "fa:16:3e:ea:6a:ea", "network": {"id": "e3aee9db-8596-4ea8-943e-5c365382ee22", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.116", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "f1a3ab6230dd468b8019424ce71de8ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cde23701-02ca-4cb4-b5a6-d321f8ac9660", "external-id": "nsx-vlan-transportzone-586", "segmentation_id": 586, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap86fe5703-6b", "ovs_interfaceid": "86fe5703-6b39-4629-9a20-387fbb55f31c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1091.669577] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Releasing lock "refresh_cache-79c92a1b-20ef-4360-93b4-913cbfcf92fe" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1091.669875] env[68617]: DEBUG nova.compute.manager [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Instance network_info: |[{"id": "86fe5703-6b39-4629-9a20-387fbb55f31c", "address": "fa:16:3e:ea:6a:ea", "network": {"id": "e3aee9db-8596-4ea8-943e-5c365382ee22", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.116", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "f1a3ab6230dd468b8019424ce71de8ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cde23701-02ca-4cb4-b5a6-d321f8ac9660", "external-id": "nsx-vlan-transportzone-586", "segmentation_id": 586, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap86fe5703-6b", "ovs_interfaceid": "86fe5703-6b39-4629-9a20-387fbb55f31c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1091.670368] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ea:6a:ea', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'cde23701-02ca-4cb4-b5a6-d321f8ac9660', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '86fe5703-6b39-4629-9a20-387fbb55f31c', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1091.678563] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Creating folder: Project (851d03d686cc400c89d6ef7ab051840b). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1091.679221] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5dc657d0-9396-4807-a697-a2f4963c5902 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1091.689388] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Created folder: Project (851d03d686cc400c89d6ef7ab051840b) in parent group-v693691.
[ 1091.689587] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Creating folder: Instances. Parent ref: group-v693743. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1091.689811] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f459f3b4-04f0-420d-8e75-d7c53c2a30e7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1091.700771] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Created folder: Instances in parent group-v693743.
[ 1091.701079] env[68617]: DEBUG oslo.service.loopingcall [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1091.701306] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1091.701544] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e6050151-7a7f-4b74-9d77-f61215f914da {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1091.722425] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1091.722425] env[68617]: value = "task-3470768"
[ 1091.722425] env[68617]: _type = "Task"
[ 1091.722425] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1091.730406] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470768, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1091.851174] env[68617]: DEBUG nova.compute.manager [req-42862abc-8b98-4c2d-a0f5-9b25bb05a58c req-c48f7a79-bbd7-4743-93d1-b2daa63ef479 service nova] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Received event network-vif-plugged-86fe5703-6b39-4629-9a20-387fbb55f31c {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1091.853996] env[68617]: DEBUG oslo_concurrency.lockutils [req-42862abc-8b98-4c2d-a0f5-9b25bb05a58c req-c48f7a79-bbd7-4743-93d1-b2daa63ef479 service nova] Acquiring lock "79c92a1b-20ef-4360-93b4-913cbfcf92fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1091.853996] env[68617]: DEBUG oslo_concurrency.lockutils [req-42862abc-8b98-4c2d-a0f5-9b25bb05a58c req-c48f7a79-bbd7-4743-93d1-b2daa63ef479 service nova] Lock "79c92a1b-20ef-4360-93b4-913cbfcf92fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1091.853996] env[68617]: DEBUG oslo_concurrency.lockutils [req-42862abc-8b98-4c2d-a0f5-9b25bb05a58c req-c48f7a79-bbd7-4743-93d1-b2daa63ef479 service nova] Lock "79c92a1b-20ef-4360-93b4-913cbfcf92fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1091.853996] env[68617]: DEBUG nova.compute.manager [req-42862abc-8b98-4c2d-a0f5-9b25bb05a58c req-c48f7a79-bbd7-4743-93d1-b2daa63ef479 service nova] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] No waiting events found dispatching network-vif-plugged-86fe5703-6b39-4629-9a20-387fbb55f31c {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1091.854396] env[68617]: WARNING nova.compute.manager [req-42862abc-8b98-4c2d-a0f5-9b25bb05a58c req-c48f7a79-bbd7-4743-93d1-b2daa63ef479 service nova] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Received unexpected event network-vif-plugged-86fe5703-6b39-4629-9a20-387fbb55f31c for instance with vm_state building and task_state spawning.
[ 1091.854396] env[68617]: DEBUG nova.compute.manager [req-42862abc-8b98-4c2d-a0f5-9b25bb05a58c req-c48f7a79-bbd7-4743-93d1-b2daa63ef479 service nova] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Received event network-changed-86fe5703-6b39-4629-9a20-387fbb55f31c {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1091.854396] env[68617]: DEBUG nova.compute.manager [req-42862abc-8b98-4c2d-a0f5-9b25bb05a58c req-c48f7a79-bbd7-4743-93d1-b2daa63ef479 service nova] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Refreshing instance network info cache due to event network-changed-86fe5703-6b39-4629-9a20-387fbb55f31c. {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1091.854396] env[68617]: DEBUG oslo_concurrency.lockutils [req-42862abc-8b98-4c2d-a0f5-9b25bb05a58c req-c48f7a79-bbd7-4743-93d1-b2daa63ef479 service nova] Acquiring lock "refresh_cache-79c92a1b-20ef-4360-93b4-913cbfcf92fe" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1091.854396] env[68617]: DEBUG oslo_concurrency.lockutils [req-42862abc-8b98-4c2d-a0f5-9b25bb05a58c req-c48f7a79-bbd7-4743-93d1-b2daa63ef479 service nova] Acquired lock "refresh_cache-79c92a1b-20ef-4360-93b4-913cbfcf92fe" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1091.854514] env[68617]: DEBUG nova.network.neutron [req-42862abc-8b98-4c2d-a0f5-9b25bb05a58c req-c48f7a79-bbd7-4743-93d1-b2daa63ef479 service nova] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Refreshing network info cache for port 86fe5703-6b39-4629-9a20-387fbb55f31c {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1092.103321] env[68617]: DEBUG nova.network.neutron [req-42862abc-8b98-4c2d-a0f5-9b25bb05a58c req-c48f7a79-bbd7-4743-93d1-b2daa63ef479 service nova] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Updated VIF entry in instance network info cache for port 86fe5703-6b39-4629-9a20-387fbb55f31c. {{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1092.103678] env[68617]: DEBUG nova.network.neutron [req-42862abc-8b98-4c2d-a0f5-9b25bb05a58c req-c48f7a79-bbd7-4743-93d1-b2daa63ef479 service nova] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Updating instance_info_cache with network_info: [{"id": "86fe5703-6b39-4629-9a20-387fbb55f31c", "address": "fa:16:3e:ea:6a:ea", "network": {"id": "e3aee9db-8596-4ea8-943e-5c365382ee22", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.116", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "f1a3ab6230dd468b8019424ce71de8ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cde23701-02ca-4cb4-b5a6-d321f8ac9660", "external-id": "nsx-vlan-transportzone-586", "segmentation_id": 586, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap86fe5703-6b", "ovs_interfaceid": "86fe5703-6b39-4629-9a20-387fbb55f31c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1092.113037] env[68617]: DEBUG oslo_concurrency.lockutils [req-42862abc-8b98-4c2d-a0f5-9b25bb05a58c req-c48f7a79-bbd7-4743-93d1-b2daa63ef479 service nova] Releasing lock "refresh_cache-79c92a1b-20ef-4360-93b4-913cbfcf92fe" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1092.232044] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470768, 'name': CreateVM_Task, 'duration_secs': 0.31111} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1092.232210] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1092.232860] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1092.233034] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1092.233340] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1092.233575] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4c07eafd-3003-4632-acfc-f19bac96ebea {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1092.237887] env[68617]: DEBUG oslo_vmware.api [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Waiting for the task: (returnval){
[ 1092.237887] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]5291925b-8288-ba7b-cac4-815e5562b050"
[ 1092.237887] env[68617]: _type = "Task"
[ 1092.237887] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1092.250846] env[68617]: DEBUG oslo_vmware.api [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]5291925b-8288-ba7b-cac4-815e5562b050, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1092.750108] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1092.750399] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1092.750635] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1097.457544] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquiring lock "1cc42c7f-8781-40b0-9f75-edfef3bc90e7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1097.457828] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "1cc42c7f-8781-40b0-9f75-edfef3bc90e7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1097.484588] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquiring lock "d46ca6f3-0ee9-412c-98b4-f639ce4f9228" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1097.486269] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "d46ca6f3-0ee9-412c-98b4-f639ce4f9228" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1097.516974] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquiring lock "a8ff6232-530c-453a-96e4-f8ce00f976e3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1097.517229] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "a8ff6232-530c-453a-96e4-f8ce00f976e3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1098.016685] env[68617]: DEBUG oslo_concurrency.lockutils [None req-442ecf20-674c-496c-a6f1-86f1a70c287c tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Acquiring lock "79c92a1b-20ef-4360-93b4-913cbfcf92fe" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1104.718813] env[68617]: DEBUG oslo_concurrency.lockutils [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Acquiring lock "5f31aef1-4806-48e1-9d5a-5dff09ea0f0d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1104.719317] env[68617]: DEBUG oslo_concurrency.lockutils [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Lock "5f31aef1-4806-48e1-9d5a-5dff09ea0f0d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1105.752026] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0e1b8687-2cf3-4567-937f-3f76cec5553d tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "030eceb1-51a5-4e34-ad67-727b7ebd524f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1105.752423] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0e1b8687-2cf3-4567-937f-3f76cec5553d tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "030eceb1-51a5-4e34-ad67-727b7ebd524f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1107.092783] env[68617]: DEBUG oslo_concurrency.lockutils [None req-56391919-a768-45ee-beff-9807502f0df1 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquiring lock "4ea5887f-84bd-4629-b568-e73c78af0ad4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1114.736116] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ba5b7b45-1fa5-4ef2-8ea9-dc1c7e6ef22e tempest-FloatingIPsAssociationNegativeTestJSON-296212251 tempest-FloatingIPsAssociationNegativeTestJSON-296212251-project-member] Acquiring lock "07927d19-2354-4215-b89d-5920e20b8222" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1114.736116] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ba5b7b45-1fa5-4ef2-8ea9-dc1c7e6ef22e tempest-FloatingIPsAssociationNegativeTestJSON-296212251 tempest-FloatingIPsAssociationNegativeTestJSON-296212251-project-member] Lock "07927d19-2354-4215-b89d-5920e20b8222" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1119.641309] env[68617]: DEBUG oslo_concurrency.lockutils [None req-513971dc-01da-410d-ae01-53e625bf6a3c tempest-AttachInterfacesV270Test-135274226 tempest-AttachInterfacesV270Test-135274226-project-member] Acquiring lock "59df690b-bfbb-4976-b80b-60106c53ba25" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1119.641596] env[68617]: DEBUG oslo_concurrency.lockutils [None req-513971dc-01da-410d-ae01-53e625bf6a3c tempest-AttachInterfacesV270Test-135274226 tempest-AttachInterfacesV270Test-135274226-project-member] Lock "59df690b-bfbb-4976-b80b-60106c53ba25" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1125.395917] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1125.396165] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 1125.396220] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 1125.435299] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1125.435445] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1125.435576] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1125.435713] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1125.435844] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1125.435961] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1125.436093] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1125.436211] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1125.436325] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1125.436441] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1125.436561] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 1126.699336] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1126.742022] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1126.742022] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1126.742022] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1127.698855] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1127.699115] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1128.694604] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1128.698286] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1128.698438] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping...
{{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1131.698578] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1131.715110] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1131.715337] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1131.715498] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1131.715652] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1131.717254] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ebdd862-e1fe-4f94-addb-71a0010e0db6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1131.726048] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b28c4af9-01df-4162-b1e0-e5bedc775db6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1131.743120] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a174af7-b75f-4ac5-863f-d5d4eda38aeb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1131.750890] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d7bff8c-ef32-4d5e-a22f-2cbe24babebd {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1131.786774] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180937MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1131.786969] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1131.787192] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1131.880407] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6300077d-5aa7-4794-8ba2-1ec30151c15c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1131.880570] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 4ea5887f-84bd-4629-b568-e73c78af0ad4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1131.880696] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6eef6e24-cf49-458b-ae37-8da4e02045f8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1131.880811] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 71b1ebba-2019-4378-9bd2-98a7559c22e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1131.880933] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e6b6cbdd-11d6-44a6-8da7-98e0f52cef67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1131.881065] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b27ace75-e2fa-4acc-96cb-88dd49b89de5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1131.881184] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 995585f5-57a4-4ba6-9e28-18a086af264c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1131.881301] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 82864ac3-a199-478c-8c57-97ea0a256201 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1131.881416] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1131.881529] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 79c92a1b-20ef-4360-93b4-913cbfcf92fe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1131.896481] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 79d8a532-b071-4c79-8c5d-f08438928201 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1131.907668] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance c9e6a9e1-6479-47ba-ae12-0441d2761bb6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1131.921712] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 96bc8135-1233-4569-99ce-c7a529b96d11 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1131.933532] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f560b4df-fb57-4f7b-8a8b-53325970e06e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1131.946526] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e4ac9902-3e8b-4790-a00b-2fd45f16ff63 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1131.956932] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 85bfa486-9f65-40d6-a392-54fdf87da1a1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1131.968515] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f63c673e-40dc-49d3-b356-85629ada1101 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1131.978568] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance d3b6336e-4baa-426e-a31d-9788cd2131a0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1131.992594] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 21c0de14-cb70-4a41-954f-aaa904d1514a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1132.003398] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b5a088c8-429a-49b3-b330-315d15ace97f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1132.014135] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 1cc42c7f-8781-40b0-9f75-edfef3bc90e7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1132.025837] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance d46ca6f3-0ee9-412c-98b4-f639ce4f9228 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1132.035934] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a8ff6232-530c-453a-96e4-f8ce00f976e3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1132.046457] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1132.056634] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 030eceb1-51a5-4e34-ad67-727b7ebd524f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1132.068256] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 07927d19-2354-4215-b89d-5920e20b8222 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1132.079022] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 59df690b-bfbb-4976-b80b-60106c53ba25 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1132.079022] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1132.079022] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1856MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1132.508668] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa3c922b-6106-49f5-b981-b13d03c1856a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1132.516377] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-baf89b54-3a40-4734-887e-5d483d6de5a9 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1132.547389] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b117f4d7-b3a6-4369-a8ca-7aeb753750b0 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1132.554938] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67fef4b8-296a-4830-b28f-5992cc66f0d8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1132.568217] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1132.587109] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1132.614736] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1132.615868] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.828s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1136.993725] env[68617]: DEBUG oslo_concurrency.lockutils [None 
req-d40a1f2b-bd26-4fa5-9cc2-e377610bb628 tempest-ServersTestManualDisk-623166759 tempest-ServersTestManualDisk-623166759-project-member] Acquiring lock "98b47fc9-678d-4c60-b9e5-78423719ae76" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1136.994041] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d40a1f2b-bd26-4fa5-9cc2-e377610bb628 tempest-ServersTestManualDisk-623166759 tempest-ServersTestManualDisk-623166759-project-member] Lock "98b47fc9-678d-4c60-b9e5-78423719ae76" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1137.266270] env[68617]: WARNING oslo_vmware.rw_handles [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1137.266270] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1137.266270] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1137.266270] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1137.266270] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1137.266270] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 1137.266270] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1137.266270] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1137.266270] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1137.266270] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1137.266270] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1137.266270] env[68617]: ERROR oslo_vmware.rw_handles [ 1137.266796] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/df9ed3ed-2ce7-4abf-a003-2b46558bd40c/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1137.268853] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1137.269198] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Copying Virtual Disk [datastore2] 
vmware_temp/df9ed3ed-2ce7-4abf-a003-2b46558bd40c/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/df9ed3ed-2ce7-4abf-a003-2b46558bd40c/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1137.271468] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6b0c90b4-4fb0-4072-863d-11206d14e5e0 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1137.278345] env[68617]: DEBUG oslo_vmware.api [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Waiting for the task: (returnval){ [ 1137.278345] env[68617]: value = "task-3470776" [ 1137.278345] env[68617]: _type = "Task" [ 1137.278345] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1137.288379] env[68617]: DEBUG oslo_vmware.api [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Task: {'id': task-3470776, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1137.790322] env[68617]: DEBUG oslo_vmware.exceptions [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Fault InvalidArgument not matched. {{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1137.790762] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1137.792150] env[68617]: ERROR nova.compute.manager [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1137.792150] env[68617]: Faults: ['InvalidArgument'] [ 1137.792150] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Traceback (most recent call last): [ 1137.792150] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1137.792150] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] yield resources [ 1137.792150] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1137.792150] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] self.driver.spawn(context, instance, image_meta, [ 1137.792150] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] File 
"/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1137.792150] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1137.792150] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1137.792150] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] self._fetch_image_if_missing(context, vi) [ 1137.792150] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1137.792548] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] image_cache(vi, tmp_image_ds_loc) [ 1137.792548] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1137.792548] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] vm_util.copy_virtual_disk( [ 1137.792548] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1137.792548] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] session._wait_for_task(vmdk_copy_task) [ 1137.792548] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1137.792548] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] return self.wait_for_task(task_ref) [ 1137.792548] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1137.792548] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] return evt.wait() [ 1137.792548] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1137.792548] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] result = hub.switch() [ 1137.792548] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1137.792548] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] return self.greenlet.switch() [ 1137.792902] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1137.792902] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] self.f(*self.args, **self.kw) [ 1137.792902] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1137.792902] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] raise exceptions.translate_fault(task_info.error) [ 
1137.792902] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1137.792902] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Faults: ['InvalidArgument'] [ 1137.792902] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] [ 1137.792902] env[68617]: INFO nova.compute.manager [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Terminating instance [ 1137.793336] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1137.793544] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1137.794182] env[68617]: DEBUG nova.compute.manager [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Start destroying the instance on the hypervisor. 
{{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1137.794369] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1137.794599] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-16f610c4-02af-4997-84c5-0794e592e5ae {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1137.796989] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebbb0abc-ddaf-4a7a-831a-7215d26c303e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1137.804281] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1137.804484] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f082a253-89f5-44c1-8dc2-83b41da87c85 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1137.806918] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1137.807114] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1137.808083] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-878a2b47-e77a-4063-8fbe-674f924ef35f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1137.813695] env[68617]: DEBUG oslo_vmware.api [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Waiting for the task: (returnval){ [ 1137.813695] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52a25102-2c91-0843-f380-db80861b1b13" [ 1137.813695] env[68617]: _type = "Task" [ 1137.813695] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1137.821285] env[68617]: DEBUG oslo_vmware.api [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52a25102-2c91-0843-f380-db80861b1b13, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1137.894573] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1137.894971] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1137.895456] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Deleting the datastore file [datastore2] 6300077d-5aa7-4794-8ba2-1ec30151c15c {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1137.895744] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-39db4ea4-d6bb-4c7b-a2b8-b52d0da0c08e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1137.901884] env[68617]: DEBUG oslo_vmware.api [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Waiting for the task: (returnval){ [ 1137.901884] env[68617]: value = "task-3470778" [ 1137.901884] env[68617]: _type = "Task" [ 1137.901884] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1137.909476] env[68617]: DEBUG oslo_vmware.api [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Task: {'id': task-3470778, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1138.327375] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1138.327375] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Creating directory with path [datastore2] vmware_temp/e9eb4fb6-dcae-437e-8e45-7f7d0ade28ce/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1138.327375] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b572e4f4-a6f4-4100-8cae-3e5289ed8cef {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1138.338416] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Created directory with path [datastore2] vmware_temp/e9eb4fb6-dcae-437e-8e45-7f7d0ade28ce/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1138.338838] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Fetch image to [datastore2] vmware_temp/e9eb4fb6-dcae-437e-8e45-7f7d0ade28ce/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1138.338838] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/e9eb4fb6-dcae-437e-8e45-7f7d0ade28ce/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1138.339793] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6112e2f-07b5-4f39-ad8d-9ee9e1a82952 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1138.346756] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cfb5cb7b-a4f4-47d6-b2a3-7271e628af74 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1138.355647] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbf48667-c07f-4e51-93a2-8db2d256aa55 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1138.388841] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-bac73eb8-3232-404a-bc0a-1a2c995d5dc1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1138.393412] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-9f10868d-fdb4-494d-ab0c-495468fc55a7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1138.412219] env[68617]: DEBUG oslo_vmware.api [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Task: {'id': task-3470778, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066815} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1138.412625] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1138.413420] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1138.413675] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1138.413894] env[68617]: INFO nova.compute.manager [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Took 0.62 seconds to destroy the instance on the hypervisor. 
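The oslo_concurrency.lockutils DEBUG lines that recur throughout this log ("Acquiring lock ... by ...", "Lock ... acquired by ... :: waited", 'Lock ... "released" by ... :: held') are emitted by lockutils' synchronized wrapper (the "inner" function at lockutils.py:404/409/423 cited in each entry) around per-instance critical sections such as build_and_run_instance.<locals>._locked_do_build_and_run_instance. A minimal runnable sketch of that pattern, assuming only the oslo.concurrency package; the function body and the UUID (an example value taken from this log) are illustrative, not Nova source:

    import logging

    from oslo_concurrency import lockutils

    # Surface lockutils' DEBUG messages, as the service log above does.
    logging.basicConfig(level=logging.DEBUG)

    INSTANCE_UUID = "a8ff6232-530c-453a-96e4-f8ce00f976e3"  # example value

    @lockutils.synchronized(INSTANCE_UUID)
    def _locked_do_build_and_run_instance():
        # A concurrent call guarded by the same lock name blocks in
        # "Acquiring lock" until this body returns; lockutils then logs
        # the waited and held durations seen in the entries above.
        pass

    _locked_do_build_and_run_instance()

By default the lock is in-process, which suffices for the green-thread workers in this service; synchronized also accepts external=True for a file-based lock shared across processes. The waited/held durations are measured and logged by lockutils itself, which is why a contended instance lock can show waits as long as ":: waited 236.391s" later in this log.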
[ 1138.419809] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1138.419809] env[68617]: DEBUG nova.compute.claims [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1138.420055] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1138.420186] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1138.507077] env[68617]: DEBUG oslo_vmware.rw_handles [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e9eb4fb6-dcae-437e-8e45-7f7d0ade28ce/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1138.580019] env[68617]: DEBUG oslo_vmware.rw_handles [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1138.580019] env[68617]: DEBUG oslo_vmware.rw_handles [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e9eb4fb6-dcae-437e-8e45-7f7d0ade28ce/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1139.002800] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bb1f4f2-a16c-4767-ac50-9e7af6cc8e0d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1139.012456] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6193833a-4126-45b2-9b7d-67be927113ee {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1139.045168] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79ae82ad-6cdc-4c4c-b803-2fce76a9cf22 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1139.053140] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31afbe91-9d8e-4f41-9618-421d11a92e8e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1139.069662] env[68617]: DEBUG nova.compute.provider_tree [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1139.089040] env[68617]: DEBUG nova.scheduler.client.report [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1139.121973] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.699s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1139.121973] env[68617]: ERROR nova.compute.manager [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1139.121973] env[68617]: Faults: ['InvalidArgument'] [ 1139.121973] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Traceback (most recent call last): [ 1139.121973] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1139.121973] env[68617]: ERROR 
nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] self.driver.spawn(context, instance, image_meta, [ 1139.121973] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1139.121973] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1139.121973] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1139.121973] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] self._fetch_image_if_missing(context, vi) [ 1139.122365] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1139.122365] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] image_cache(vi, tmp_image_ds_loc) [ 1139.122365] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1139.122365] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] vm_util.copy_virtual_disk( [ 1139.122365] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1139.122365] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] session._wait_for_task(vmdk_copy_task) [ 1139.122365] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1139.122365] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] return self.wait_for_task(task_ref) [ 1139.122365] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1139.122365] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] return evt.wait() [ 1139.122365] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1139.122365] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] result = hub.switch() [ 1139.122365] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1139.122708] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] return self.greenlet.switch() [ 1139.122708] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1139.122708] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] self.f(*self.args, **self.kw) [ 1139.122708] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1139.122708] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] raise exceptions.translate_fault(task_info.error) [ 1139.122708] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1139.122708] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Faults: ['InvalidArgument'] [ 1139.122708] env[68617]: ERROR nova.compute.manager [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] [ 1139.122708] env[68617]: DEBUG nova.compute.utils [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1139.123931] env[68617]: DEBUG nova.compute.manager [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Build of instance 6300077d-5aa7-4794-8ba2-1ec30151c15c was re-scheduled: A specified parameter was not correct: fileType [ 1139.123931] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1139.124546] env[68617]: DEBUG nova.compute.manager [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1139.124852] env[68617]: DEBUG nova.compute.manager [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1139.125147] env[68617]: DEBUG nova.compute.manager [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1139.125440] env[68617]: DEBUG nova.network.neutron [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1140.106318] env[68617]: DEBUG nova.network.neutron [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1140.136534] env[68617]: INFO nova.compute.manager [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Took 1.01 seconds to deallocate network for instance. [ 1140.694102] env[68617]: INFO nova.scheduler.client.report [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Deleted allocations for instance 6300077d-5aa7-4794-8ba2-1ec30151c15c [ 1140.752845] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e2fb04cc-4476-41d3-9e39-707bf7a5ae86 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Lock "6300077d-5aa7-4794-8ba2-1ec30151c15c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 437.375s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1140.754162] env[68617]: DEBUG oslo_concurrency.lockutils [None req-518992bc-62fb-4e9a-b170-108b82689824 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Lock "6300077d-5aa7-4794-8ba2-1ec30151c15c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 236.391s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1140.754787] env[68617]: DEBUG oslo_concurrency.lockutils [None req-518992bc-62fb-4e9a-b170-108b82689824 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Acquiring lock "6300077d-5aa7-4794-8ba2-1ec30151c15c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1140.754787] env[68617]: DEBUG oslo_concurrency.lockutils [None req-518992bc-62fb-4e9a-b170-108b82689824 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Lock "6300077d-5aa7-4794-8ba2-1ec30151c15c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68617) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1140.754787] env[68617]: DEBUG oslo_concurrency.lockutils [None req-518992bc-62fb-4e9a-b170-108b82689824 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Lock "6300077d-5aa7-4794-8ba2-1ec30151c15c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1140.759623] env[68617]: INFO nova.compute.manager [None req-518992bc-62fb-4e9a-b170-108b82689824 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Terminating instance [ 1140.762707] env[68617]: DEBUG nova.compute.manager [None req-518992bc-62fb-4e9a-b170-108b82689824 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1140.762899] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-518992bc-62fb-4e9a-b170-108b82689824 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1140.767084] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8bb2bf42-3ffe-4cfb-85b4-568ec37e0f70 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1140.772385] env[68617]: DEBUG nova.compute.manager [None req-9263fd01-4686-49e9-a410-a88e49136d17 tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 40de8cd1-1c46-4ffb-866b-255386fe44b6] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1140.782355] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1e0dcd4-b167-409d-8bed-656d683c7df9 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1140.823269] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-518992bc-62fb-4e9a-b170-108b82689824 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 6300077d-5aa7-4794-8ba2-1ec30151c15c could not be found. [ 1140.823476] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-518992bc-62fb-4e9a-b170-108b82689824 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1140.823649] env[68617]: INFO nova.compute.manager [None req-518992bc-62fb-4e9a-b170-108b82689824 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Took 0.06 seconds to destroy the instance on the hypervisor.
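The build failure above is worth unpacking: vCenter rejects the CopyVirtualDisk_Task call ("A specified parameter was not correct: fileType", fault InvalidArgument), oslo.vmware's task poller translates the task error into a VimFaultException, and Nova then aborts the resource claim, reschedules the build, and finally terminates an instance that never materialized on the backend. A minimal, stdlib-only sketch of that polling pattern follows; FakeTask and wait_for_task are illustrative stand-ins, not the oslo.vmware API.

import time


class VimFaultException(Exception):
    """Stand-in for oslo_vmware.exceptions.VimFaultException."""

    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list


class FakeTask:
    """Simulates a CopyVirtualDisk_Task that vCenter rejects."""

    def __init__(self):
        self.polls = 0

    def info(self):
        self.polls += 1
        if self.polls < 3:
            return {"state": "running", "error": None}
        # Mirrors the failure in the log: the fileType parameter of the
        # copy spec is rejected with an InvalidArgument fault.
        return {"state": "error",
                "error": {"faults": ["InvalidArgument"],
                          "msg": "A specified parameter was not correct: fileType"}}


def wait_for_task(task, interval=0.1):
    """Poll a task until it finishes; raise on error (cf. api.py _poll_task)."""
    while True:
        info = task.info()
        if info["state"] == "success":
            return info
        if info["state"] == "error":
            err = info["error"]
            raise VimFaultException(err["faults"], err["msg"])
        time.sleep(interval)


try:
    wait_for_task(FakeTask())
except VimFaultException as exc:
    print("Faults: %s: %s" % (exc.fault_list, exc))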
[ 1140.823889] env[68617]: DEBUG oslo.service.loopingcall [None req-518992bc-62fb-4e9a-b170-108b82689824 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1140.824123] env[68617]: DEBUG nova.compute.manager [-] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1140.824217] env[68617]: DEBUG nova.network.neutron [-] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1140.845673] env[68617]: DEBUG nova.compute.manager [None req-9263fd01-4686-49e9-a410-a88e49136d17 tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 40de8cd1-1c46-4ffb-866b-255386fe44b6] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1140.913331] env[68617]: DEBUG nova.network.neutron [-] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1140.919099] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9263fd01-4686-49e9-a410-a88e49136d17 tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Lock "40de8cd1-1c46-4ffb-866b-255386fe44b6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 239.857s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1140.936033] env[68617]: INFO nova.compute.manager [-] [instance: 6300077d-5aa7-4794-8ba2-1ec30151c15c] Took 0.11 seconds to deallocate network for instance. [ 1140.959366] env[68617]: DEBUG nova.compute.manager [None req-f5125ede-cdae-41f7-b164-3802b3036641 tempest-VolumesAdminNegativeTest-561724217 tempest-VolumesAdminNegativeTest-561724217-project-member] [instance: 1cc76382-5452-4ed4-bb99-c6800c70d42a] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1141.014305] env[68617]: DEBUG nova.compute.manager [None req-f5125ede-cdae-41f7-b164-3802b3036641 tempest-VolumesAdminNegativeTest-561724217 tempest-VolumesAdminNegativeTest-561724217-project-member] [instance: 1cc76382-5452-4ed4-bb99-c6800c70d42a] Instance disappeared before build.
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1141.065047] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f5125ede-cdae-41f7-b164-3802b3036641 tempest-VolumesAdminNegativeTest-561724217 tempest-VolumesAdminNegativeTest-561724217-project-member] Lock "1cc76382-5452-4ed4-bb99-c6800c70d42a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 227.125s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1141.089296] env[68617]: DEBUG nova.compute.manager [None req-d6c7d1a5-b7ee-48d5-b286-c143104f8926 tempest-ServersTestMultiNic-884689889 tempest-ServersTestMultiNic-884689889-project-member] [instance: 3258d1a5-7142-4e06-814d-e68fd90262ae] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1141.147153] env[68617]: DEBUG nova.compute.manager [None req-d6c7d1a5-b7ee-48d5-b286-c143104f8926 tempest-ServersTestMultiNic-884689889 tempest-ServersTestMultiNic-884689889-project-member] [instance: 3258d1a5-7142-4e06-814d-e68fd90262ae] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1141.194987] env[68617]: DEBUG oslo_concurrency.lockutils [None req-518992bc-62fb-4e9a-b170-108b82689824 tempest-ServerDiagnosticsTest-773527931 tempest-ServerDiagnosticsTest-773527931-project-member] Lock "6300077d-5aa7-4794-8ba2-1ec30151c15c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.441s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1141.207170] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d6c7d1a5-b7ee-48d5-b286-c143104f8926 tempest-ServersTestMultiNic-884689889 tempest-ServersTestMultiNic-884689889-project-member] Lock "3258d1a5-7142-4e06-814d-e68fd90262ae" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 217.254s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1141.243832] env[68617]: DEBUG nova.compute.manager [None req-df11a94d-2e23-4a9d-904f-e8df2d8982ce tempest-MigrationsAdminTest-1112293401 tempest-MigrationsAdminTest-1112293401-project-member] [instance: e3a2fb7d-b092-485f-b64a-486c458ba845] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1141.323026] env[68617]: DEBUG nova.compute.manager [None req-df11a94d-2e23-4a9d-904f-e8df2d8982ce tempest-MigrationsAdminTest-1112293401 tempest-MigrationsAdminTest-1112293401-project-member] [instance: e3a2fb7d-b092-485f-b64a-486c458ba845] Instance disappeared before build.
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1141.430233] env[68617]: DEBUG oslo_concurrency.lockutils [None req-df11a94d-2e23-4a9d-904f-e8df2d8982ce tempest-MigrationsAdminTest-1112293401 tempest-MigrationsAdminTest-1112293401-project-member] Lock "e3a2fb7d-b092-485f-b64a-486c458ba845" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 208.355s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1141.455032] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Acquiring lock "e90877a8-47d3-47d7-8362-5bcfe3a98c36" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1141.457246] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Lock "e90877a8-47d3-47d7-8362-5bcfe3a98c36" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1141.457633] env[68617]: DEBUG nova.compute.manager [None req-b221c568-8686-45ed-a2d9-100ed1519d21 tempest-SecurityGroupsTestJSON-1069621129 tempest-SecurityGroupsTestJSON-1069621129-project-member] [instance: eaeae56d-8e71-43bc-8441-49a29c161763] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1141.502538] env[68617]: DEBUG nova.compute.manager [None req-b221c568-8686-45ed-a2d9-100ed1519d21 tempest-SecurityGroupsTestJSON-1069621129 tempest-SecurityGroupsTestJSON-1069621129-project-member] [instance: eaeae56d-8e71-43bc-8441-49a29c161763] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1141.537262] env[68617]: DEBUG oslo_concurrency.lockutils [None req-b221c568-8686-45ed-a2d9-100ed1519d21 tempest-SecurityGroupsTestJSON-1069621129 tempest-SecurityGroupsTestJSON-1069621129-project-member] Lock "eaeae56d-8e71-43bc-8441-49a29c161763" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 206.953s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1141.556465] env[68617]: DEBUG nova.compute.manager [None req-37833af3-316f-48e8-98d4-ffffff3a5894 tempest-ServerRescueNegativeTestJSON-1564947093 tempest-ServerRescueNegativeTestJSON-1564947093-project-member] [instance: 79d8a532-b071-4c79-8c5d-f08438928201] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1141.589889] env[68617]: DEBUG nova.compute.manager [None req-37833af3-316f-48e8-98d4-ffffff3a5894 tempest-ServerRescueNegativeTestJSON-1564947093 tempest-ServerRescueNegativeTestJSON-1564947093-project-member] [instance: 79d8a532-b071-4c79-8c5d-f08438928201] Instance disappeared before build.
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1141.621835] env[68617]: DEBUG oslo_concurrency.lockutils [None req-37833af3-316f-48e8-98d4-ffffff3a5894 tempest-ServerRescueNegativeTestJSON-1564947093 tempest-ServerRescueNegativeTestJSON-1564947093-project-member] Lock "79d8a532-b071-4c79-8c5d-f08438928201" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 201.292s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1141.639421] env[68617]: DEBUG nova.compute.manager [None req-8fbf2960-ea12-4872-8e64-f7390800461a tempest-ServerRescueNegativeTestJSON-1564947093 tempest-ServerRescueNegativeTestJSON-1564947093-project-member] [instance: c9e6a9e1-6479-47ba-ae12-0441d2761bb6] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1141.679127] env[68617]: DEBUG nova.compute.manager [None req-8fbf2960-ea12-4872-8e64-f7390800461a tempest-ServerRescueNegativeTestJSON-1564947093 tempest-ServerRescueNegativeTestJSON-1564947093-project-member] [instance: c9e6a9e1-6479-47ba-ae12-0441d2761bb6] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1141.706394] env[68617]: DEBUG oslo_concurrency.lockutils [None req-8fbf2960-ea12-4872-8e64-f7390800461a tempest-ServerRescueNegativeTestJSON-1564947093 tempest-ServerRescueNegativeTestJSON-1564947093-project-member] Lock "c9e6a9e1-6479-47ba-ae12-0441d2761bb6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 199.596s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1141.723134] env[68617]: DEBUG nova.compute.manager [None req-b69cc1c6-c860-42b7-9835-b29069b7969d tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: 96bc8135-1233-4569-99ce-c7a529b96d11] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1141.754954] env[68617]: DEBUG nova.compute.manager [None req-b69cc1c6-c860-42b7-9835-b29069b7969d tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: 96bc8135-1233-4569-99ce-c7a529b96d11] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1141.787914] env[68617]: DEBUG oslo_concurrency.lockutils [None req-b69cc1c6-c860-42b7-9835-b29069b7969d tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Lock "96bc8135-1233-4569-99ce-c7a529b96d11" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 199.535s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1141.799024] env[68617]: DEBUG nova.compute.manager [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Starting instance...
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1141.936746] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1141.936988] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1141.938482] env[68617]: INFO nova.compute.claims [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1142.392649] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e6c4781-d099-4a59-9da6-f4a37769abd4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1142.400131] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1f465c3-9852-4229-b0d9-ae8535f73e3b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1142.429991] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2fade9b-616e-413f-a5e3-111aadd7e4d8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1142.436915] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72639807-2e85-467c-9507-2c1638caa6ed {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1142.450353] env[68617]: DEBUG nova.compute.provider_tree [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1142.468133] env[68617]: DEBUG nova.scheduler.client.report [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1142.500815] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.564s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1142.501347] env[68617]: DEBUG nova.compute.manager [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1142.551390] env[68617]: DEBUG nova.compute.utils [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1142.552734] env[68617]: DEBUG nova.compute.manager [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1142.552902] env[68617]: DEBUG nova.network.neutron [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1142.562247] env[68617]: DEBUG nova.compute.manager [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1142.630490] env[68617]: DEBUG nova.policy [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1e154d9552a242baae9169208dd17a64', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bb40c285c7b849f4959da3d6b0428062', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 1142.645766] env[68617]: DEBUG nova.compute.manager [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1142.701404] env[68617]: DEBUG nova.virt.hardware [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=<?>,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-17T05:31:26Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1142.701404] env[68617]: DEBUG nova.virt.hardware [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1142.701404] env[68617]: DEBUG nova.virt.hardware [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1142.701532] env[68617]: DEBUG nova.virt.hardware [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1142.701532] env[68617]: DEBUG nova.virt.hardware [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1142.701532] env[68617]: DEBUG nova.virt.hardware [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1142.701532] env[68617]: DEBUG nova.virt.hardware [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1142.701532] env[68617]: DEBUG nova.virt.hardware [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Build topologies for 1
vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1142.701719] env[68617]: DEBUG nova.virt.hardware [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1142.701719] env[68617]: DEBUG nova.virt.hardware [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1142.701719] env[68617]: DEBUG nova.virt.hardware [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1142.702462] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c8c9c6f-daaa-408b-919f-308a168b5317 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1142.716303] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f10ecaf-9553-477d-ad68-52fb0afc55fb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1143.119860] env[68617]: DEBUG nova.network.neutron [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Successfully created port: d015097b-b390-48d4-95a8-e5a6581af921 {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1143.249990] env[68617]: DEBUG oslo_concurrency.lockutils [None req-736e2c4f-639a-4754-a01a-e2e8f03ac09f tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Acquiring lock "f560b4df-fb57-4f7b-8a8b-53325970e06e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1143.718589] env[68617]: DEBUG oslo_concurrency.lockutils [None req-277d5d4d-ea80-411f-8178-19d3676a982d tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Acquiring lock "c5764a1d-3370-4756-ada0-03b503368d17" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1143.719080] env[68617]: DEBUG oslo_concurrency.lockutils [None req-277d5d4d-ea80-411f-8178-19d3676a982d tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Lock "c5764a1d-3370-4756-ada0-03b503368d17" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1143.760448] env[68617]: DEBUG oslo_concurrency.lockutils [None req-277d5d4d-ea80-411f-8178-19d3676a982d tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Acquiring lock "c0528a20-34cb-4b51-bb4c-8c3828021a85" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1143.760680] env[68617]: DEBUG oslo_concurrency.lockutils [None req-277d5d4d-ea80-411f-8178-19d3676a982d tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Lock "c0528a20-34cb-4b51-bb4c-8c3828021a85" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1144.102294] env[68617]: DEBUG nova.compute.manager [req-2189eeb3-d6f8-4393-84be-fcf769e5b0af req-af4f9966-7bc2-4d62-9102-5c17fb3313a9 service nova] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Received event network-vif-plugged-d015097b-b390-48d4-95a8-e5a6581af921 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1144.102294] env[68617]: DEBUG oslo_concurrency.lockutils [req-2189eeb3-d6f8-4393-84be-fcf769e5b0af req-af4f9966-7bc2-4d62-9102-5c17fb3313a9 service nova] Acquiring lock "f560b4df-fb57-4f7b-8a8b-53325970e06e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1144.102294] env[68617]: DEBUG oslo_concurrency.lockutils [req-2189eeb3-d6f8-4393-84be-fcf769e5b0af req-af4f9966-7bc2-4d62-9102-5c17fb3313a9 service nova] Lock "f560b4df-fb57-4f7b-8a8b-53325970e06e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1144.102294] env[68617]: DEBUG oslo_concurrency.lockutils [req-2189eeb3-d6f8-4393-84be-fcf769e5b0af req-af4f9966-7bc2-4d62-9102-5c17fb3313a9 service nova] Lock "f560b4df-fb57-4f7b-8a8b-53325970e06e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1144.102764] env[68617]: DEBUG nova.compute.manager [req-2189eeb3-d6f8-4393-84be-fcf769e5b0af req-af4f9966-7bc2-4d62-9102-5c17fb3313a9 service nova] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] No waiting events found dispatching network-vif-plugged-d015097b-b390-48d4-95a8-e5a6581af921 {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1144.102764] env[68617]: WARNING nova.compute.manager [req-2189eeb3-d6f8-4393-84be-fcf769e5b0af req-af4f9966-7bc2-4d62-9102-5c17fb3313a9 service nova] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Received unexpected event network-vif-plugged-d015097b-b390-48d4-95a8-e5a6581af921 for instance with vm_state building and task_state deleting.
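The Acquiring/acquired/released lines that dominate this stretch of the log come from oslo.concurrency's named-lock helpers, which record how long each caller waited for and then held a lock. A minimal sketch of that usage pattern follows; it requires oslo.concurrency, and the lock name and function below are illustrative, not Nova code.

# Sketch of the named-lock pattern behind the "Acquiring lock ... by ..." /
# "acquired ... waited Ns" / "released ... held Ns" lines above.
import logging

from oslo_concurrency import lockutils

# lockutils logs via the stdlib logging module; enable DEBUG to see the
# Acquiring/acquired/released messages.
logging.basicConfig(level=logging.DEBUG)


@lockutils.synchronized('f560b4df-fb57-4f7b-8a8b-53325970e06e-events')
def pop_event():
    # Runs only while the named in-process semaphore is held; the inner
    # wrapper in lockutils.py emits the waited/held durations seen above.
    return None


pop_event()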
[ 1144.209916] env[68617]: DEBUG nova.network.neutron [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Successfully updated port: d015097b-b390-48d4-95a8-e5a6581af921 {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1144.246068] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Acquiring lock "refresh_cache-f560b4df-fb57-4f7b-8a8b-53325970e06e" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1144.246068] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Acquired lock "refresh_cache-f560b4df-fb57-4f7b-8a8b-53325970e06e" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1144.246068] env[68617]: DEBUG nova.network.neutron [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1144.285417] env[68617]: DEBUG nova.network.neutron [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Instance cache missing network info. 
{{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1144.677257] env[68617]: DEBUG nova.network.neutron [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Updating instance_info_cache with network_info: [{"id": "d015097b-b390-48d4-95a8-e5a6581af921", "address": "fa:16:3e:f6:23:f0", "network": {"id": "e1f9ca9c-cc82-4132-a12d-8e55cdae0d1d", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-936932594-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bb40c285c7b849f4959da3d6b0428062", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d6fab536-1e48-4d07-992a-076f0e6d089c", "external-id": "nsx-vlan-transportzone-61", "segmentation_id": 61, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd015097b-b3", "ovs_interfaceid": "d015097b-b390-48d4-95a8-e5a6581af921", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1144.707754] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Releasing lock "refresh_cache-f560b4df-fb57-4f7b-8a8b-53325970e06e" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1144.708075] env[68617]: DEBUG nova.compute.manager [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Instance network_info: |[{"id": "d015097b-b390-48d4-95a8-e5a6581af921", "address": "fa:16:3e:f6:23:f0", "network": {"id": "e1f9ca9c-cc82-4132-a12d-8e55cdae0d1d", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-936932594-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bb40c285c7b849f4959da3d6b0428062", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d6fab536-1e48-4d07-992a-076f0e6d089c", "external-id": "nsx-vlan-transportzone-61", "segmentation_id": 61, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd015097b-b3", "ovs_interfaceid": "d015097b-b390-48d4-95a8-e5a6581af921", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, 
"meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1144.708467] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f6:23:f0', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd6fab536-1e48-4d07-992a-076f0e6d089c', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd015097b-b390-48d4-95a8-e5a6581af921', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1144.718908] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Creating folder: Project (bb40c285c7b849f4959da3d6b0428062). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1144.721551] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5953f545-4714-46d5-b5b0-65fa3adc73b7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1144.734481] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Created folder: Project (bb40c285c7b849f4959da3d6b0428062) in parent group-v693691. [ 1144.734713] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Creating folder: Instances. Parent ref: group-v693750. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1144.735058] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0f591964-a0f9-43f1-85ea-d31a8e3b7bd6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1144.743655] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Created folder: Instances in parent group-v693750. [ 1144.747394] env[68617]: DEBUG oslo.service.loopingcall [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1144.747394] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1144.747394] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9b9d730c-951b-4294-b828-7e3867e5dd47 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1144.764528] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1144.764528] env[68617]: value = "task-3470784" [ 1144.764528] env[68617]: _type = "Task" [ 1144.764528] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1144.772733] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470784, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1145.281291] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470784, 'name': CreateVM_Task, 'duration_secs': 0.332849} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1145.281291] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1145.282628] env[68617]: DEBUG oslo_vmware.service [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7068bfa-040e-494b-9410-898dbdc2d646 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1145.293267] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1145.293267] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1145.293267] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1145.293267] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-335af832-aa47-4af4-8956-72d01c15293e {{(pid=68617) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1145.297576] env[68617]: DEBUG oslo_vmware.api [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Waiting for the task: (returnval){ [ 1145.297576] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52cfc44c-b50c-3044-280c-2592616c548a" [ 1145.297576] env[68617]: _type = "Task" [ 1145.297576] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1145.306460] env[68617]: DEBUG oslo_vmware.api [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52cfc44c-b50c-3044-280c-2592616c548a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1145.707940] env[68617]: DEBUG oslo_concurrency.lockutils [None req-428077de-3a63-4b0c-a517-f64d25193b26 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "fa9b2716-783b-4b19-bfc9-aad609c3a659" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1145.708186] env[68617]: DEBUG oslo_concurrency.lockutils [None req-428077de-3a63-4b0c-a517-f64d25193b26 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "fa9b2716-783b-4b19-bfc9-aad609c3a659" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1145.808828] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1145.809221] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1145.809466] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1145.809594] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280
tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1145.809778] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1145.810044] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-26b89e14-1290-4924-be44-b6f15b4afaa5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1145.826711] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1145.827069] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1145.827701] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48911f80-e482-42d2-a001-a7e8662445d6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1145.834209] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-820eb7f9-4426-437d-8068-756b657f92bf {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1145.840026] env[68617]: DEBUG oslo_vmware.api [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Waiting for the task: (returnval){ [ 1145.840026] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]5269d968-b0bc-c1be-95c1-56988f266b16" [ 1145.840026] env[68617]: _type = "Task" [ 1145.840026] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1145.847740] env[68617]: DEBUG oslo_vmware.api [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]5269d968-b0bc-c1be-95c1-56988f266b16, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1146.129898] env[68617]: DEBUG nova.compute.manager [req-70dda9b7-73a8-49f3-ae6d-934054bf6002 req-2bb9fa5b-043e-4215-a196-2ae7884e466e service nova] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Received event network-changed-d015097b-b390-48d4-95a8-e5a6581af921 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1146.130567] env[68617]: DEBUG nova.compute.manager [req-70dda9b7-73a8-49f3-ae6d-934054bf6002 req-2bb9fa5b-043e-4215-a196-2ae7884e466e service nova] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Refreshing instance network info cache due to event network-changed-d015097b-b390-48d4-95a8-e5a6581af921. {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1146.130860] env[68617]: DEBUG oslo_concurrency.lockutils [req-70dda9b7-73a8-49f3-ae6d-934054bf6002 req-2bb9fa5b-043e-4215-a196-2ae7884e466e service nova] Acquiring lock "refresh_cache-f560b4df-fb57-4f7b-8a8b-53325970e06e" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1146.131073] env[68617]: DEBUG oslo_concurrency.lockutils [req-70dda9b7-73a8-49f3-ae6d-934054bf6002 req-2bb9fa5b-043e-4215-a196-2ae7884e466e service nova] Acquired lock "refresh_cache-f560b4df-fb57-4f7b-8a8b-53325970e06e" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1146.131446] env[68617]: DEBUG nova.network.neutron [req-70dda9b7-73a8-49f3-ae6d-934054bf6002 req-2bb9fa5b-043e-4215-a196-2ae7884e466e service nova] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Refreshing network info cache for port d015097b-b390-48d4-95a8-e5a6581af921 {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1146.351547] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1146.352341] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Creating directory with path [datastore1] vmware_temp/77d89391-b373-4fed-b654-f6b18d3efb37/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1146.352694] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-de9366eb-2a1d-4ad4-a790-7e6f6737bc6b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1146.376206] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Created directory with path [datastore1] vmware_temp/77d89391-b373-4fed-b654-f6b18d3efb37/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1146.376206] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 
tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Fetch image to [datastore1] vmware_temp/77d89391-b373-4fed-b654-f6b18d3efb37/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1146.376206] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore1] vmware_temp/77d89391-b373-4fed-b654-f6b18d3efb37/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore1 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1146.376206] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fad46ea1-2a30-4a44-b8d2-d67e4796ea0d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1146.384629] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e362fdf5-d3f5-4602-8286-356ae357c024 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1146.396965] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b48fba91-6b02-4550-a5ac-50854add9071 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1146.439150] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90653a76-4d25-4d9f-81a5-372880cc7312 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1146.447564] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f5b6f9ff-af34-4b63-af37-04c42b0c8d45 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1146.471495] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore1 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1146.553851] env[68617]: DEBUG oslo_vmware.rw_handles [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/77d89391-b373-4fed-b654-f6b18d3efb37/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1146.619277] env[68617]: DEBUG nova.network.neutron [req-70dda9b7-73a8-49f3-ae6d-934054bf6002 req-2bb9fa5b-043e-4215-a196-2ae7884e466e service nova] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Updated VIF entry in instance network info cache for port d015097b-b390-48d4-95a8-e5a6581af921. {{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1146.620483] env[68617]: DEBUG nova.network.neutron [req-70dda9b7-73a8-49f3-ae6d-934054bf6002 req-2bb9fa5b-043e-4215-a196-2ae7884e466e service nova] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Updating instance_info_cache with network_info: [{"id": "d015097b-b390-48d4-95a8-e5a6581af921", "address": "fa:16:3e:f6:23:f0", "network": {"id": "e1f9ca9c-cc82-4132-a12d-8e55cdae0d1d", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-936932594-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bb40c285c7b849f4959da3d6b0428062", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d6fab536-1e48-4d07-992a-076f0e6d089c", "external-id": "nsx-vlan-transportzone-61", "segmentation_id": 61, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd015097b-b3", "ovs_interfaceid": "d015097b-b390-48d4-95a8-e5a6581af921", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1146.623647] env[68617]: DEBUG oslo_vmware.rw_handles [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1146.623811] env[68617]: DEBUG oslo_vmware.rw_handles [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/77d89391-b373-4fed-b654-f6b18d3efb37/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1146.636695] env[68617]: DEBUG oslo_concurrency.lockutils [req-70dda9b7-73a8-49f3-ae6d-934054bf6002 req-2bb9fa5b-043e-4215-a196-2ae7884e466e service nova] Releasing lock "refresh_cache-f560b4df-fb57-4f7b-8a8b-53325970e06e" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1146.907873] env[68617]: DEBUG oslo_concurrency.lockutils [None req-514c2fd6-8e81-4a02-9b43-0cce1a26c8db tempest-ServerActionsV293TestJSON-754830659 tempest-ServerActionsV293TestJSON-754830659-project-member] Acquiring lock "dd611e75-aac1-4cdb-b263-6956d6254743" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1146.908206] env[68617]: DEBUG oslo_concurrency.lockutils [None req-514c2fd6-8e81-4a02-9b43-0cce1a26c8db tempest-ServerActionsV293TestJSON-754830659 tempest-ServerActionsV293TestJSON-754830659-project-member] Lock "dd611e75-aac1-4cdb-b263-6956d6254743" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1150.174789] env[68617]: DEBUG oslo_concurrency.lockutils [None req-21a980d9-c4a6-49bd-8bad-a84fc36b0223 tempest-AttachVolumeTestJSON-339037198 tempest-AttachVolumeTestJSON-339037198-project-member] Acquiring lock "075eb6cb-a53b-44d9-986d-bc85d4b8ac25" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1150.175101] env[68617]: DEBUG oslo_concurrency.lockutils [None req-21a980d9-c4a6-49bd-8bad-a84fc36b0223 tempest-AttachVolumeTestJSON-339037198 tempest-AttachVolumeTestJSON-339037198-project-member] Lock "075eb6cb-a53b-44d9-986d-bc85d4b8ac25" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1153.760230] env[68617]: DEBUG oslo_concurrency.lockutils [None req-b78ff9d6-5247-44ba-96c0-619c412e50d9 tempest-AttachInterfacesTestJSON-753337404 tempest-AttachInterfacesTestJSON-753337404-project-member] Acquiring lock "65014c6f-8b4e-4468-9462-4b8cdc08af73" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1153.760230] env[68617]: DEBUG oslo_concurrency.lockutils [None req-b78ff9d6-5247-44ba-96c0-619c412e50d9 tempest-AttachInterfacesTestJSON-753337404 tempest-AttachInterfacesTestJSON-753337404-project-member] Lock "65014c6f-8b4e-4468-9462-4b8cdc08af73" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1155.186770] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a6159447-108b-431a-a879-8a4ec5c03363 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquiring lock "7e1c7e8a-139e-4e8a-a3e1-39a2d7c3fc47" by
"nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1155.186770] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a6159447-108b-431a-a879-8a4ec5c03363 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Lock "7e1c7e8a-139e-4e8a-a3e1-39a2d7c3fc47" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1156.297435] env[68617]: DEBUG oslo_concurrency.lockutils [None req-4261bb17-68ab-4b31-99d3-638d8a02ef5f tempest-ImagesTestJSON-918330909 tempest-ImagesTestJSON-918330909-project-member] Acquiring lock "2bffd2c4-f290-4df6-b7b6-6dd963befdab" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1156.297721] env[68617]: DEBUG oslo_concurrency.lockutils [None req-4261bb17-68ab-4b31-99d3-638d8a02ef5f tempest-ImagesTestJSON-918330909 tempest-ImagesTestJSON-918330909-project-member] Lock "2bffd2c4-f290-4df6-b7b6-6dd963befdab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1164.657618] env[68617]: DEBUG oslo_concurrency.lockutils [None req-15c8b2c8-3704-406d-85e8-cb2c2467602c tempest-ServerActionsTestOtherB-1124123640 tempest-ServerActionsTestOtherB-1124123640-project-member] Acquiring lock "13d6e00b-3c18-4346-b229-b56bdaba2dc8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1164.657905] env[68617]: DEBUG oslo_concurrency.lockutils [None req-15c8b2c8-3704-406d-85e8-cb2c2467602c tempest-ServerActionsTestOtherB-1124123640 tempest-ServerActionsTestOtherB-1124123640-project-member] Lock "13d6e00b-3c18-4346-b229-b56bdaba2dc8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1185.615854] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1185.615854] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1185.615854] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1185.638218] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Skipping network cache update for instance because it is Building.
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1185.638470] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1185.638612] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1185.638740] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1185.638862] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1185.638979] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1185.639116] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1185.639234] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1185.639349] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1185.639469] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1185.639586] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1186.698633] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1186.698918] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1187.699220] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1187.770486] env[68617]: WARNING oslo_vmware.rw_handles [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1187.770486] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1187.770486] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1187.770486] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1187.770486] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1187.770486] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 1187.770486] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1187.770486] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1187.770486] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1187.770486] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1187.770486] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1187.770486] env[68617]: ERROR oslo_vmware.rw_handles [ 1187.770486] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/e9eb4fb6-dcae-437e-8e45-7f7d0ade28ce/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1187.771486] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1187.771724] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 
tempest-ListServerFiltersTestJSON-136232528-project-member] Copying Virtual Disk [datastore2] vmware_temp/e9eb4fb6-dcae-437e-8e45-7f7d0ade28ce/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/e9eb4fb6-dcae-437e-8e45-7f7d0ade28ce/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1187.772010] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-364f3ba1-2565-422f-9afd-af6196579cb9 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1187.784786] env[68617]: DEBUG oslo_vmware.api [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Waiting for the task: (returnval){ [ 1187.784786] env[68617]: value = "task-3470785" [ 1187.784786] env[68617]: _type = "Task" [ 1187.784786] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1187.792705] env[68617]: DEBUG oslo_vmware.api [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Task: {'id': task-3470785, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1188.296299] env[68617]: DEBUG oslo_vmware.exceptions [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Fault InvalidArgument not matched. 
{{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1188.296431] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1188.297146] env[68617]: ERROR nova.compute.manager [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1188.297146] env[68617]: Faults: ['InvalidArgument'] [ 1188.297146] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Traceback (most recent call last): [ 1188.297146] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1188.297146] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] yield resources [ 1188.297146] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1188.297146] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] self.driver.spawn(context, instance, image_meta, [ 1188.297146] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1188.297146] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1188.297146] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1188.297146] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] self._fetch_image_if_missing(context, vi) [ 1188.297146] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1188.297776] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] image_cache(vi, tmp_image_ds_loc) [ 1188.297776] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1188.297776] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] vm_util.copy_virtual_disk( [ 1188.297776] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1188.297776] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] session._wait_for_task(vmdk_copy_task) [ 1188.297776] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1188.297776] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] return self.wait_for_task(task_ref) [ 1188.297776] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1188.297776] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] return evt.wait() [ 1188.297776] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1188.297776] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] result = hub.switch() [ 1188.297776] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1188.297776] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] return self.greenlet.switch() [ 1188.298261] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1188.298261] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] self.f(*self.args, **self.kw) [ 1188.298261] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1188.298261] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] raise exceptions.translate_fault(task_info.error) [ 1188.298261] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1188.298261] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Faults: ['InvalidArgument'] [ 1188.298261] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] [ 1188.298261] env[68617]: INFO nova.compute.manager [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Terminating instance [ 1188.298919] env[68617]: DEBUG oslo_concurrency.lockutils [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1188.299142] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1188.299392] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-dc8cd368-ae8f-442e-aff9-fe36b255e506 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1188.301818] env[68617]: DEBUG nova.compute.manager [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1188.302015] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1188.303085] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d79e3633-100f-41c8-906d-8c0d2f926dbd {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1188.311164] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1188.311164] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-69f6222c-4934-4755-8021-a58812944b0f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1188.313270] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1188.313346] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1188.314363] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9772bf03-4552-48cd-a8e2-93b14c6fd3ce {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1188.321174] env[68617]: DEBUG oslo_vmware.api [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Waiting for the task: (returnval){ [ 1188.321174] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]5294f1c5-838c-6a0c-351b-bcb12952bfff" [ 1188.321174] env[68617]: _type = "Task" [ 1188.321174] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1188.327301] env[68617]: DEBUG oslo_vmware.api [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]5294f1c5-838c-6a0c-351b-bcb12952bfff, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1188.699589] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1188.699829] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1188.830481] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1188.830830] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Creating directory with path [datastore2] vmware_temp/964df620-00e0-49d2-84e3-d804cf53d12e/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1188.831121] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-08267253-4dfd-427d-a820-fbcf8b5d28b5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1188.853010] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Created directory with path [datastore2] vmware_temp/964df620-00e0-49d2-84e3-d804cf53d12e/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1188.853676] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Fetch image to [datastore2] vmware_temp/964df620-00e0-49d2-84e3-d804cf53d12e/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1188.853903] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/964df620-00e0-49d2-84e3-d804cf53d12e/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) 
_fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1188.854751] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3c9f5f8-70a0-4324-a90f-7b55d4971bdf {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1188.863364] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae89964c-b5c2-4c83-95e2-77da71b3a453 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1188.874581] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3131bfc9-652b-4d90-a208-e397b7b4c30a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1188.908696] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18cea079-505d-4d57-9745-d94135dd2b77 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1188.915920] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-9332cbc7-81d5-4b3c-bd9e-878e3e544555 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1188.937203] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1189.003976] env[68617]: DEBUG oslo_vmware.rw_handles [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/964df620-00e0-49d2-84e3-d804cf53d12e/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1189.071794] env[68617]: DEBUG oslo_vmware.rw_handles [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1189.072501] env[68617]: DEBUG oslo_vmware.rw_handles [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/964df620-00e0-49d2-84e3-d804cf53d12e/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1189.698543] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1189.698749] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1190.694997] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1192.390742] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1192.391091] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1192.391903] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Deleting the datastore file [datastore2] 4ea5887f-84bd-4629-b568-e73c78af0ad4 {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1192.391903] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e14cb057-0d37-4a06-87c8-30b4a02bde92 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1192.397858] env[68617]: DEBUG oslo_vmware.api [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Waiting for the task: (returnval){ [ 1192.397858] env[68617]: value = "task-3470787" [ 1192.397858] env[68617]: _type = "Task" [ 1192.397858] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1192.405933] env[68617]: DEBUG oslo_vmware.api [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Task: {'id': task-3470787, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1192.908467] env[68617]: DEBUG oslo_vmware.api [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Task: {'id': task-3470787, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.148878} completed successfully. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1192.908961] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1192.909286] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1192.909583] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1192.909874] env[68617]: INFO nova.compute.manager [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Took 4.61 seconds to destroy the instance on the hypervisor. [ 1192.912481] env[68617]: DEBUG nova.compute.claims [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1192.912779] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1192.913132] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1193.372849] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b9d17bd-86a3-490e-9ae2-cc7c4702fff6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1193.381410] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e484a0fe-5745-427c-9e93-29736164c9e2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1193.425467] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e11fb540-88fc-4112-b606-c43ad3fe0ed6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1193.434216] env[68617]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4aee7589-2b26-441f-ba2d-49ab4773d2c3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1193.448584] env[68617]: DEBUG nova.compute.provider_tree [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1193.463472] env[68617]: DEBUG nova.scheduler.client.report [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1193.482290] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.569s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1193.482923] env[68617]: ERROR nova.compute.manager [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1193.482923] env[68617]: Faults: ['InvalidArgument'] [ 1193.482923] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Traceback (most recent call last): [ 1193.482923] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1193.482923] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] self.driver.spawn(context, instance, image_meta, [ 1193.482923] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1193.482923] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1193.482923] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1193.482923] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] self._fetch_image_if_missing(context, vi) [ 1193.482923] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in 
_fetch_image_if_missing [ 1193.482923] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] image_cache(vi, tmp_image_ds_loc) [ 1193.482923] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1193.483260] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] vm_util.copy_virtual_disk( [ 1193.483260] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1193.483260] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] session._wait_for_task(vmdk_copy_task) [ 1193.483260] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1193.483260] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] return self.wait_for_task(task_ref) [ 1193.483260] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1193.483260] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] return evt.wait() [ 1193.483260] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1193.483260] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] result = hub.switch() [ 1193.483260] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1193.483260] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] return self.greenlet.switch() [ 1193.483260] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1193.483260] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] self.f(*self.args, **self.kw) [ 1193.483577] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1193.483577] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] raise exceptions.translate_fault(task_info.error) [ 1193.483577] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1193.483577] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Faults: ['InvalidArgument'] [ 1193.483577] env[68617]: ERROR nova.compute.manager [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] [ 1193.483703] env[68617]: DEBUG nova.compute.utils [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] VimFaultException {{(pid=68617) 
notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1193.486632] env[68617]: DEBUG nova.compute.manager [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Build of instance 4ea5887f-84bd-4629-b568-e73c78af0ad4 was re-scheduled: A specified parameter was not correct: fileType [ 1193.486632] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1193.487063] env[68617]: DEBUG nova.compute.manager [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1193.487248] env[68617]: DEBUG nova.compute.manager [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1193.487432] env[68617]: DEBUG nova.compute.manager [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1193.487604] env[68617]: DEBUG nova.network.neutron [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1193.700287] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1193.710648] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1193.710895] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1193.711076] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1193.711254] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None 
None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1193.712499] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a03c0278-a12b-488e-8c29-24dab9890fac {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1193.722424] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e71ba76f-2bc7-4f47-8433-409d1114f44c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1193.739447] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14a6541f-0c29-43b7-902c-abdd50fb89b8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1193.749029] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30e7a5b4-7b7a-4f2a-accd-d5f3088fd452 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1193.786150] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180904MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1193.786326] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1193.786542] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1193.890137] env[68617]: DEBUG nova.network.neutron [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1193.908748] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 4ea5887f-84bd-4629-b568-e73c78af0ad4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1193.909010] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6eef6e24-cf49-458b-ae37-8da4e02045f8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1193.909210] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 71b1ebba-2019-4378-9bd2-98a7559c22e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1193.909388] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e6b6cbdd-11d6-44a6-8da7-98e0f52cef67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1193.910282] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b27ace75-e2fa-4acc-96cb-88dd49b89de5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1193.910444] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 995585f5-57a4-4ba6-9e28-18a086af264c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1193.910574] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 82864ac3-a199-478c-8c57-97ea0a256201 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1193.910696] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1193.910813] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 79c92a1b-20ef-4360-93b4-913cbfcf92fe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1193.910930] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f560b4df-fb57-4f7b-8a8b-53325970e06e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1193.912468] env[68617]: INFO nova.compute.manager [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Took 0.42 seconds to deallocate network for instance. [ 1193.924780] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 1cc42c7f-8781-40b0-9f75-edfef3bc90e7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1193.947450] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance d46ca6f3-0ee9-412c-98b4-f639ce4f9228 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1193.986236] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a8ff6232-530c-453a-96e4-f8ce00f976e3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1193.998615] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1194.021203] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 030eceb1-51a5-4e34-ad67-727b7ebd524f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1194.033330] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 07927d19-2354-4215-b89d-5920e20b8222 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1194.046102] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 59df690b-bfbb-4976-b80b-60106c53ba25 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1194.057188] env[68617]: INFO nova.scheduler.client.report [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Deleted allocations for instance 4ea5887f-84bd-4629-b568-e73c78af0ad4 [ 1194.064756] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 98b47fc9-678d-4c60-b9e5-78423719ae76 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1194.079887] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e90877a8-47d3-47d7-8362-5bcfe3a98c36 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1194.092620] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance c5764a1d-3370-4756-ada0-03b503368d17 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1194.104308] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1c8ffd06-d09a-4972-acbd-931915c53e95 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Lock "4ea5887f-84bd-4629-b568-e73c78af0ad4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 487.191s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1194.106488] env[68617]: DEBUG oslo_concurrency.lockutils [None req-56391919-a768-45ee-beff-9807502f0df1 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Lock "4ea5887f-84bd-4629-b568-e73c78af0ad4" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 87.014s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1194.106720] env[68617]: DEBUG oslo_concurrency.lockutils [None req-56391919-a768-45ee-beff-9807502f0df1 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquiring lock "4ea5887f-84bd-4629-b568-e73c78af0ad4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1194.107187] env[68617]: DEBUG oslo_concurrency.lockutils [None req-56391919-a768-45ee-beff-9807502f0df1 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Lock "4ea5887f-84bd-4629-b568-e73c78af0ad4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1194.107187] env[68617]: DEBUG oslo_concurrency.lockutils [None req-56391919-a768-45ee-beff-9807502f0df1 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Lock "4ea5887f-84bd-4629-b568-e73c78af0ad4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1194.110789] env[68617]: INFO nova.compute.manager [None req-56391919-a768-45ee-beff-9807502f0df1 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Terminating instance [ 1194.113274] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance c0528a20-34cb-4b51-bb4c-8c3828021a85 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1194.114565] env[68617]: DEBUG nova.compute.manager [None req-56391919-a768-45ee-beff-9807502f0df1 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1194.114822] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-56391919-a768-45ee-beff-9807502f0df1 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1194.115455] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9f33cc2c-34ad-4f29-a76c-81bde5b22046 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1194.118519] env[68617]: DEBUG nova.compute.manager [None req-75f49065-4ad5-4a89-990d-f4fdd6a5a0a5 tempest-AttachInterfacesTestJSON-753337404 tempest-AttachInterfacesTestJSON-753337404-project-member] [instance: e4ac9902-3e8b-4790-a00b-2fd45f16ff63] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1194.129808] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1a5184f-07db-4290-8456-9d4f63d9839d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1194.146498] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance fa9b2716-783b-4b19-bfc9-aad609c3a659 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1194.148097] env[68617]: DEBUG nova.compute.manager [None req-75f49065-4ad5-4a89-990d-f4fdd6a5a0a5 tempest-AttachInterfacesTestJSON-753337404 tempest-AttachInterfacesTestJSON-753337404-project-member] [instance: e4ac9902-3e8b-4790-a00b-2fd45f16ff63] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1194.168575] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-56391919-a768-45ee-beff-9807502f0df1 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4ea5887f-84bd-4629-b568-e73c78af0ad4 could not be found. 
[ 1194.168787] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-56391919-a768-45ee-beff-9807502f0df1 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1194.169087] env[68617]: INFO nova.compute.manager [None req-56391919-a768-45ee-beff-9807502f0df1 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1194.169325] env[68617]: DEBUG oslo.service.loopingcall [None req-56391919-a768-45ee-beff-9807502f0df1 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1194.171804] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance dd611e75-aac1-4cdb-b263-6956d6254743 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1194.172959] env[68617]: DEBUG nova.compute.manager [-] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1194.173072] env[68617]: DEBUG nova.network.neutron [-] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1194.182596] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 075eb6cb-a53b-44d9-986d-bc85d4b8ac25 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1194.185489] env[68617]: DEBUG oslo_concurrency.lockutils [None req-75f49065-4ad5-4a89-990d-f4fdd6a5a0a5 tempest-AttachInterfacesTestJSON-753337404 tempest-AttachInterfacesTestJSON-753337404-project-member] Lock "e4ac9902-3e8b-4790-a00b-2fd45f16ff63" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 239.342s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1194.194531] env[68617]: DEBUG nova.compute.manager [None req-e0dddcde-5965-4448-b33a-0c88fdb64fe2 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 85bfa486-9f65-40d6-a392-54fdf87da1a1] Starting instance... 
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1194.199502] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 65014c6f-8b4e-4468-9462-4b8cdc08af73 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1194.205321] env[68617]: DEBUG nova.network.neutron [-] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1194.216013] env[68617]: INFO nova.compute.manager [-] [instance: 4ea5887f-84bd-4629-b568-e73c78af0ad4] Took 0.04 seconds to deallocate network for instance. [ 1194.224272] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 7e1c7e8a-139e-4e8a-a3e1-39a2d7c3fc47 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1194.230956] env[68617]: DEBUG nova.compute.manager [None req-e0dddcde-5965-4448-b33a-0c88fdb64fe2 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 85bfa486-9f65-40d6-a392-54fdf87da1a1] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1194.235941] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 2bffd2c4-f290-4df6-b7b6-6dd963befdab has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1194.251401] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 13d6e00b-3c18-4346-b229-b56bdaba2dc8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1194.251711] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1194.251960] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1728MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1194.274314] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e0dddcde-5965-4448-b33a-0c88fdb64fe2 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Lock "85bfa486-9f65-40d6-a392-54fdf87da1a1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 237.610s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1194.289799] env[68617]: DEBUG nova.compute.manager [None req-31397e9d-74f9-4277-b762-6589b4c28702 tempest-ImagesTestJSON-918330909 tempest-ImagesTestJSON-918330909-project-member] [instance: f63c673e-40dc-49d3-b356-85629ada1101] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1194.338223] env[68617]: DEBUG oslo_concurrency.lockutils [None req-56391919-a768-45ee-beff-9807502f0df1 tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Lock "4ea5887f-84bd-4629-b568-e73c78af0ad4" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.232s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1194.349893] env[68617]: DEBUG nova.compute.manager [None req-31397e9d-74f9-4277-b762-6589b4c28702 tempest-ImagesTestJSON-918330909 tempest-ImagesTestJSON-918330909-project-member] [instance: f63c673e-40dc-49d3-b356-85629ada1101] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1194.373949] env[68617]: DEBUG oslo_concurrency.lockutils [None req-31397e9d-74f9-4277-b762-6589b4c28702 tempest-ImagesTestJSON-918330909 tempest-ImagesTestJSON-918330909-project-member] Lock "f63c673e-40dc-49d3-b356-85629ada1101" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 236.048s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1194.388653] env[68617]: DEBUG nova.compute.manager [None req-399a0f64-0267-46ff-b331-fe461e69dc70 tempest-ServerActionsTestJSON-789019370 tempest-ServerActionsTestJSON-789019370-project-member] [instance: d3b6336e-4baa-426e-a31d-9788cd2131a0] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1194.432888] env[68617]: DEBUG nova.compute.manager [None req-399a0f64-0267-46ff-b331-fe461e69dc70 tempest-ServerActionsTestJSON-789019370 tempest-ServerActionsTestJSON-789019370-project-member] [instance: d3b6336e-4baa-426e-a31d-9788cd2131a0] Instance disappeared before build. 
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1194.470285] env[68617]: DEBUG oslo_concurrency.lockutils [None req-399a0f64-0267-46ff-b331-fe461e69dc70 tempest-ServerActionsTestJSON-789019370 tempest-ServerActionsTestJSON-789019370-project-member] Lock "d3b6336e-4baa-426e-a31d-9788cd2131a0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 235.400s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1194.486495] env[68617]: DEBUG nova.compute.manager [None req-12c64557-bf69-4e0e-af49-57136b751ce7 tempest-ServersNegativeTestJSON-272895408 tempest-ServersNegativeTestJSON-272895408-project-member] [instance: 21c0de14-cb70-4a41-954f-aaa904d1514a] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1194.521625] env[68617]: DEBUG nova.compute.manager [None req-12c64557-bf69-4e0e-af49-57136b751ce7 tempest-ServersNegativeTestJSON-272895408 tempest-ServersNegativeTestJSON-272895408-project-member] [instance: 21c0de14-cb70-4a41-954f-aaa904d1514a] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1194.550025] env[68617]: DEBUG oslo_concurrency.lockutils [None req-12c64557-bf69-4e0e-af49-57136b751ce7 tempest-ServersNegativeTestJSON-272895408 tempest-ServersNegativeTestJSON-272895408-project-member] Lock "21c0de14-cb70-4a41-954f-aaa904d1514a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 233.942s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1194.561685] env[68617]: DEBUG nova.compute.manager [None req-46c98673-4183-49ef-95be-7e9c465a3475 tempest-InstanceActionsV221TestJSON-2063899890 tempest-InstanceActionsV221TestJSON-2063899890-project-member] [instance: b5a088c8-429a-49b3-b330-315d15ace97f] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1194.594165] env[68617]: DEBUG nova.compute.manager [None req-46c98673-4183-49ef-95be-7e9c465a3475 tempest-InstanceActionsV221TestJSON-2063899890 tempest-InstanceActionsV221TestJSON-2063899890-project-member] [instance: b5a088c8-429a-49b3-b330-315d15ace97f] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1194.624112] env[68617]: DEBUG oslo_concurrency.lockutils [None req-46c98673-4183-49ef-95be-7e9c465a3475 tempest-InstanceActionsV221TestJSON-2063899890 tempest-InstanceActionsV221TestJSON-2063899890-project-member] Lock "b5a088c8-429a-49b3-b330-315d15ace97f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 230.633s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1194.642825] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Starting instance... 
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1194.707609] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1194.749081] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd6b5445-4984-4423-bfd6-8dcf66ac7553 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1194.757032] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-002b3076-dfb9-44b3-88cf-17a384d34435 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1194.789272] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d58a5cc1-5a4d-4910-adfa-be95866e9529 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1194.793244] env[68617]: WARNING oslo_vmware.rw_handles [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1194.793244] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1194.793244] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1194.793244] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1194.793244] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1194.793244] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 1194.793244] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1194.793244] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1194.793244] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1194.793244] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1194.793244] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1194.793244] env[68617]: ERROR oslo_vmware.rw_handles [ 1194.793657] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/77d89391-b373-4fed-b654-f6b18d3efb37/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore1 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1194.796063] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 
tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1194.796063] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Copying Virtual Disk [datastore1] vmware_temp/77d89391-b373-4fed-b654-f6b18d3efb37/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore1] vmware_temp/77d89391-b373-4fed-b654-f6b18d3efb37/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1194.796063] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9b00406c-60d9-42a6-a627-bfddca4adeb7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1194.801846] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6103fb84-78c8-41f8-9d74-609a254cb7ba {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1194.807215] env[68617]: DEBUG oslo_vmware.api [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Waiting for the task: (returnval){ [ 1194.807215] env[68617]: value = "task-3470788" [ 1194.807215] env[68617]: _type = "Task" [ 1194.807215] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1194.819326] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1194.826131] env[68617]: DEBUG oslo_vmware.api [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Task: {'id': task-3470788, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1194.828756] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1194.845252] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1194.845472] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.059s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1194.845757] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.139s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1194.847347] env[68617]: INFO nova.compute.claims [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1195.322255] env[68617]: DEBUG oslo_vmware.exceptions [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Fault InvalidArgument not matched. 
{{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1195.322871] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1195.323268] env[68617]: ERROR nova.compute.manager [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1195.323268] env[68617]: Faults: ['InvalidArgument'] [ 1195.323268] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Traceback (most recent call last): [ 1195.323268] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1195.323268] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] yield resources [ 1195.323268] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1195.323268] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] self.driver.spawn(context, instance, image_meta, [ 1195.323268] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1195.323268] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1195.323268] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1195.323268] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] self._fetch_image_if_missing(context, vi) [ 1195.323268] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1195.323644] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] image_cache(vi, tmp_image_ds_loc) [ 1195.323644] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1195.323644] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] vm_util.copy_virtual_disk( [ 1195.323644] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1195.323644] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] session._wait_for_task(vmdk_copy_task) [ 1195.323644] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1195.323644] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] return self.wait_for_task(task_ref) [ 1195.323644] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1195.323644] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] return evt.wait() [ 1195.323644] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1195.323644] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] result = hub.switch() [ 1195.323644] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1195.323644] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] return self.greenlet.switch() [ 1195.323978] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1195.323978] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] self.f(*self.args, **self.kw) [ 1195.323978] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1195.323978] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] raise exceptions.translate_fault(task_info.error) [ 1195.323978] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1195.323978] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Faults: ['InvalidArgument'] [ 1195.323978] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] [ 1195.323978] env[68617]: INFO nova.compute.manager [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Terminating instance [ 1195.327233] env[68617]: DEBUG nova.compute.manager [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Start destroying the instance on the hypervisor. 
{{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1195.327233] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1195.328239] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e5b0859-f9a1-4fdf-9f0f-ff9b25c8f4fd {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.339440] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1195.342338] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-df38cd66-9a7c-4cd3-9ad2-675237c484e8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.369398] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c1675d4-db9c-4165-a8ea-3d7a62ce8f17 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.376697] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab7bdcd2-b43f-4edf-aed3-0202633a00f0 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.417379] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5db921f-0d50-47e4-b36b-fb8ec4e8282c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.419931] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1195.419993] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Deleting contents of the VM from datastore datastore1 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1195.420230] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Deleting the datastore file [datastore1] f560b4df-fb57-4f7b-8a8b-53325970e06e {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1195.420503] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8c318453-26de-4220-a299-f85ee3624dec {{(pid=68617) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.428147] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3b598c4-aea4-4dd0-b584-d2275caab1d1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.432223] env[68617]: DEBUG oslo_vmware.api [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Waiting for the task: (returnval){ [ 1195.432223] env[68617]: value = "task-3470790" [ 1195.432223] env[68617]: _type = "Task" [ 1195.432223] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1195.444056] env[68617]: DEBUG nova.compute.provider_tree [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1195.450316] env[68617]: DEBUG oslo_vmware.api [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Task: {'id': task-3470790, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1195.456314] env[68617]: DEBUG nova.scheduler.client.report [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1195.473676] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.628s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1195.474320] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Start building networks asynchronously for instance. 
{{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1195.520901] env[68617]: DEBUG nova.compute.utils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1195.522253] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1195.522467] env[68617]: DEBUG nova.network.neutron [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1195.535878] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1195.600718] env[68617]: DEBUG nova.policy [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6a52ac4298854c2481284a1d27a5e808', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2585c2453a3f41ac85950f43c05b7025', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 1195.604020] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1195.633696] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1195.633914] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1195.634120] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1195.634311] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1195.637427] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1195.637427] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1195.637427] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1195.637427] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 1195.637427] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1195.637754] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1195.637754] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1195.637754] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b47c1d3e-6283-4ecb-bd4e-8d01397a7f18 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.644466] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71956d73-1d95-4fd3-a87c-7da1ccddac32 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.950232] env[68617]: DEBUG oslo_vmware.api [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Task: {'id': task-3470790, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079647} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1195.950232] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1195.950232] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Deleted contents of the VM from datastore datastore1 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1195.950232] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1195.950806] env[68617]: INFO nova.compute.manager [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Took 0.62 seconds to destroy the instance on the hypervisor. 
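The nova.virt.hardware entries above trace the CPU-topology selection for the m1.nano flavor: with no flavor or image limits and no preferences (all 0:0:0), Nova enumerates every (sockets, cores, threads) factorisation of the vCPU count under the 65536-per-dimension maximums, and for vcpus=1 that leaves exactly one candidate, 1:1:1. A minimal standalone sketch of that enumeration (illustrative only, not Nova's actual _get_possible_cpu_topologies):

    from collections import namedtuple

    VirtCPUTopology = namedtuple("VirtCPUTopology", "sockets cores threads")

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        # Keep every factorisation sockets * cores * threads == vcpus
        # that stays within the per-dimension maximums.
        found = []
        for sockets in range(1, min(vcpus, max_sockets) + 1):
            if vcpus % sockets:
                continue
            per_socket = vcpus // sockets
            for cores in range(1, min(per_socket, max_cores) + 1):
                if per_socket % cores:
                    continue
                threads = per_socket // cores
                if threads <= max_threads:
                    found.append(VirtCPUTopology(sockets, cores, threads))
        return found

    # Matches the "Got 1 possible topologies" line for this flavor:
    print(possible_topologies(1))  # [VirtCPUTopology(sockets=1, cores=1, threads=1)]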
[ 1195.957295] env[68617]: DEBUG nova.compute.claims [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1195.957531] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1195.957837] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1195.974426] env[68617]: DEBUG nova.network.neutron [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Successfully created port: a8c9898d-e9e6-4134-b06f-fb23e04bdf4b {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1196.100285] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Acquiring lock "f03b9bc5-9438-4c0c-b595-72c631bece08" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1196.100551] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Lock "f03b9bc5-9438-4c0c-b595-72c631bece08" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1196.439066] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4988b14-fa38-412e-bb57-6975602a5685 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1196.448669] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ff571a6-44d5-4fa2-a7dd-5b3f9f8c8481 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1196.485892] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b33d2f0e-532b-4cbe-8e10-a91019e75571 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1196.495794] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with
opID=oslo.vmware-51a5ac6f-f493-4d94-8781-1884873db9a7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1196.512010] env[68617]: DEBUG nova.compute.provider_tree [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1196.525155] env[68617]: DEBUG nova.scheduler.client.report [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1196.550729] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.593s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1196.551463] env[68617]: ERROR nova.compute.manager [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1196.551463] env[68617]: Faults: ['InvalidArgument'] [ 1196.551463] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Traceback (most recent call last): [ 1196.551463] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1196.551463] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] self.driver.spawn(context, instance, image_meta, [ 1196.551463] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1196.551463] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1196.551463] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1196.551463] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] self._fetch_image_if_missing(context, vi) [ 1196.551463] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in 
_fetch_image_if_missing [ 1196.551463] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] image_cache(vi, tmp_image_ds_loc) [ 1196.551463] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1196.551842] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] vm_util.copy_virtual_disk( [ 1196.551842] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1196.551842] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] session._wait_for_task(vmdk_copy_task) [ 1196.551842] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1196.551842] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] return self.wait_for_task(task_ref) [ 1196.551842] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1196.551842] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] return evt.wait() [ 1196.551842] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1196.551842] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] result = hub.switch() [ 1196.551842] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1196.551842] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] return self.greenlet.switch() [ 1196.551842] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1196.551842] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] self.f(*self.args, **self.kw) [ 1196.552206] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1196.552206] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] raise exceptions.translate_fault(task_info.error) [ 1196.552206] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1196.552206] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Faults: ['InvalidArgument'] [ 1196.552206] env[68617]: ERROR nova.compute.manager [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] [ 1196.552206] env[68617]: DEBUG nova.compute.utils [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] VimFaultException {{(pid=68617) 
notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1196.553711] env[68617]: DEBUG nova.compute.manager [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Build of instance f560b4df-fb57-4f7b-8a8b-53325970e06e was re-scheduled: A specified parameter was not correct: fileType [ 1196.553711] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1196.554085] env[68617]: DEBUG nova.compute.manager [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1196.554277] env[68617]: DEBUG nova.compute.manager [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1196.554453] env[68617]: DEBUG nova.compute.manager [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1196.554617] env[68617]: DEBUG nova.network.neutron [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1197.185565] env[68617]: DEBUG nova.compute.manager [req-5b4024ba-22db-40f8-8595-61611688ca4e req-ca5679fe-bcae-4b41-829d-84a8a8eaba9f service nova] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Received event network-vif-plugged-a8c9898d-e9e6-4134-b06f-fb23e04bdf4b {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1197.185782] env[68617]: DEBUG oslo_concurrency.lockutils [req-5b4024ba-22db-40f8-8595-61611688ca4e req-ca5679fe-bcae-4b41-829d-84a8a8eaba9f service nova] Acquiring lock "1cc42c7f-8781-40b0-9f75-edfef3bc90e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1197.185988] env[68617]: DEBUG oslo_concurrency.lockutils [req-5b4024ba-22db-40f8-8595-61611688ca4e req-ca5679fe-bcae-4b41-829d-84a8a8eaba9f service nova] Lock "1cc42c7f-8781-40b0-9f75-edfef3bc90e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1197.186168] env[68617]: DEBUG oslo_concurrency.lockutils [req-5b4024ba-22db-40f8-8595-61611688ca4e req-ca5679fe-bcae-4b41-829d-84a8a8eaba9f service nova] Lock "1cc42c7f-8781-40b0-9f75-edfef3bc90e7-events" "released"
by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1197.186506] env[68617]: DEBUG nova.compute.manager [req-5b4024ba-22db-40f8-8595-61611688ca4e req-ca5679fe-bcae-4b41-829d-84a8a8eaba9f service nova] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] No waiting events found dispatching network-vif-plugged-a8c9898d-e9e6-4134-b06f-fb23e04bdf4b {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1197.186698] env[68617]: WARNING nova.compute.manager [req-5b4024ba-22db-40f8-8595-61611688ca4e req-ca5679fe-bcae-4b41-829d-84a8a8eaba9f service nova] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Received unexpected event network-vif-plugged-a8c9898d-e9e6-4134-b06f-fb23e04bdf4b for instance with vm_state building and task_state spawning. [ 1197.477687] env[68617]: DEBUG nova.network.neutron [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Successfully updated port: a8c9898d-e9e6-4134-b06f-fb23e04bdf4b {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1197.490069] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquiring lock "refresh_cache-1cc42c7f-8781-40b0-9f75-edfef3bc90e7" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1197.490253] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquired lock "refresh_cache-1cc42c7f-8781-40b0-9f75-edfef3bc90e7" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1197.490415] env[68617]: DEBUG nova.network.neutron [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1197.536093] env[68617]: DEBUG nova.network.neutron [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1197.553654] env[68617]: INFO nova.compute.manager [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Took 1.00 seconds to deallocate network for instance. [ 1197.558741] env[68617]: DEBUG nova.network.neutron [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Instance cache missing network info. 
{{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1197.678758] env[68617]: INFO nova.scheduler.client.report [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Deleted allocations for instance f560b4df-fb57-4f7b-8a8b-53325970e06e [ 1197.717188] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5b93fb59-67f6-4b98-a2d8-5cac01943f54 tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Lock "f560b4df-fb57-4f7b-8a8b-53325970e06e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 251.203s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1197.718324] env[68617]: DEBUG oslo_concurrency.lockutils [None req-736e2c4f-639a-4754-a01a-e2e8f03ac09f tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Lock "f560b4df-fb57-4f7b-8a8b-53325970e06e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 54.468s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1197.718547] env[68617]: DEBUG oslo_concurrency.lockutils [None req-736e2c4f-639a-4754-a01a-e2e8f03ac09f tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Acquiring lock "f560b4df-fb57-4f7b-8a8b-53325970e06e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1197.718752] env[68617]: DEBUG oslo_concurrency.lockutils [None req-736e2c4f-639a-4754-a01a-e2e8f03ac09f tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Lock "f560b4df-fb57-4f7b-8a8b-53325970e06e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1197.718915] env[68617]: DEBUG oslo_concurrency.lockutils [None req-736e2c4f-639a-4754-a01a-e2e8f03ac09f tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Lock "f560b4df-fb57-4f7b-8a8b-53325970e06e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1197.723995] env[68617]: INFO nova.compute.manager [None req-736e2c4f-639a-4754-a01a-e2e8f03ac09f tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Terminating instance [ 1197.726144] env[68617]: DEBUG oslo_concurrency.lockutils [None req-736e2c4f-639a-4754-a01a-e2e8f03ac09f tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Acquiring lock "refresh_cache-f560b4df-fb57-4f7b-8a8b-53325970e06e" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1197.726318] env[68617]: DEBUG
oslo_concurrency.lockutils [None req-736e2c4f-639a-4754-a01a-e2e8f03ac09f tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Acquired lock "refresh_cache-f560b4df-fb57-4f7b-8a8b-53325970e06e" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1197.726482] env[68617]: DEBUG nova.network.neutron [None req-736e2c4f-639a-4754-a01a-e2e8f03ac09f tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1197.736057] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1197.763976] env[68617]: DEBUG nova.network.neutron [None req-736e2c4f-639a-4754-a01a-e2e8f03ac09f tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1197.789110] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1197.789368] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1197.790862] env[68617]: INFO nova.compute.claims [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1197.890255] env[68617]: DEBUG nova.network.neutron [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Updating instance_info_cache with network_info: [{"id": "a8c9898d-e9e6-4134-b06f-fb23e04bdf4b", "address": "fa:16:3e:a5:64:8e", "network": {"id": "62b5016e-8314-483a-87c5-3175a0b6a0eb", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1469879935-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": 
"192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2585c2453a3f41ac85950f43c05b7025", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2ed91b7b-b4ec-486d-ab34-af0afb7ec691", "external-id": "nsx-vlan-transportzone-75", "segmentation_id": 75, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa8c9898d-e9", "ovs_interfaceid": "a8c9898d-e9e6-4134-b06f-fb23e04bdf4b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1197.911862] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Releasing lock "refresh_cache-1cc42c7f-8781-40b0-9f75-edfef3bc90e7" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1197.913364] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Instance network_info: |[{"id": "a8c9898d-e9e6-4134-b06f-fb23e04bdf4b", "address": "fa:16:3e:a5:64:8e", "network": {"id": "62b5016e-8314-483a-87c5-3175a0b6a0eb", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1469879935-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2585c2453a3f41ac85950f43c05b7025", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2ed91b7b-b4ec-486d-ab34-af0afb7ec691", "external-id": "nsx-vlan-transportzone-75", "segmentation_id": 75, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa8c9898d-e9", "ovs_interfaceid": "a8c9898d-e9e6-4134-b06f-fb23e04bdf4b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1197.913459] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a5:64:8e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2ed91b7b-b4ec-486d-ab34-af0afb7ec691', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a8c9898d-e9e6-4134-b06f-fb23e04bdf4b', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1197.921929] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 
tempest-ListServersNegativeTestJSON-361708762-project-member] Creating folder: Project (2585c2453a3f41ac85950f43c05b7025). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1197.925073] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f21ee87e-ce04-4951-94c4-39ae7954da1c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1197.935082] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Created folder: Project (2585c2453a3f41ac85950f43c05b7025) in parent group-v693691. [ 1197.935282] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Creating folder: Instances. Parent ref: group-v693753. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1197.935576] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-31718913-f0b4-4ebb-9543-87a829ec2aac {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1197.945066] env[68617]: DEBUG nova.network.neutron [None req-736e2c4f-639a-4754-a01a-e2e8f03ac09f tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1197.946903] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Created folder: Instances in parent group-v693753. [ 1197.947140] env[68617]: DEBUG oslo.service.loopingcall [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1197.947521] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1197.947715] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1e825429-e264-4050-8fa0-55f34a2579f5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1197.965589] env[68617]: DEBUG oslo_concurrency.lockutils [None req-736e2c4f-639a-4754-a01a-e2e8f03ac09f tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Releasing lock "refresh_cache-f560b4df-fb57-4f7b-8a8b-53325970e06e" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1197.965967] env[68617]: DEBUG nova.compute.manager [None req-736e2c4f-639a-4754-a01a-e2e8f03ac09f tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1197.966167] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-736e2c4f-639a-4754-a01a-e2e8f03ac09f tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1197.969592] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-144c6ec9-5a00-40b1-a98f-12e15f28b307 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1197.972756] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1197.972756] env[68617]: value = "task-3470793" [ 1197.972756] env[68617]: _type = "Task" [ 1197.972756] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1197.982877] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a331c639-9568-40fe-9d30-6b8fcd920814 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1197.998157] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470793, 'name': CreateVM_Task} progress is 5%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1198.016663] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-736e2c4f-639a-4754-a01a-e2e8f03ac09f tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f560b4df-fb57-4f7b-8a8b-53325970e06e could not be found. 
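The task entries above (Waiting for the task returnval "task-3470793", then "CreateVM_Task} progress is 5%", then its completion further down) come from oslo.vmware's task polling, the same loop whose error branch surfaced the earlier InvalidArgument fault via translate_fault. A hedged sketch of that polling shape (the function names and dict-based task info here are illustrative, not oslo.vmware's real API):

    import time

    class TaskFaultError(Exception):
        """Stands in for the translated VimFaultException seen earlier."""

    def wait_for_task(get_task_info, interval=0.5, timeout=300.0):
        # Poll the vCenter task until it leaves the queued/running states,
        # then return its result or surface the recorded fault.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = get_task_info()  # e.g. one PropertyCollector read of task.info
            if info["state"] == "success":
                return info.get("result")
            if info["state"] == "error":
                raise TaskFaultError(info["error"])
            # queued/running: the real loop logs "progress is N%" here
            time.sleep(interval)
        raise TimeoutError("task did not complete within %.0fs" % timeout)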
[ 1198.016867] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-736e2c4f-639a-4754-a01a-e2e8f03ac09f tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1198.017059] env[68617]: INFO nova.compute.manager [None req-736e2c4f-639a-4754-a01a-e2e8f03ac09f tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1198.017302] env[68617]: DEBUG oslo.service.loopingcall [None req-736e2c4f-639a-4754-a01a-e2e8f03ac09f tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1198.017744] env[68617]: DEBUG nova.compute.manager [-] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1198.017832] env[68617]: DEBUG nova.network.neutron [-] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1198.037460] env[68617]: DEBUG nova.network.neutron [-] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1198.049705] env[68617]: DEBUG nova.network.neutron [-] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1198.068749] env[68617]: INFO nova.compute.manager [-] [instance: f560b4df-fb57-4f7b-8a8b-53325970e06e] Took 0.05 seconds to deallocate network for instance.
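With the VM already gone from the backend, the terminate path above proceeds straight to network deallocation, run as _deallocate_network_with_retries under oslo.service's looping-call helper. A simplified sketch of that retry wrapper (attempt counts and delays are illustrative, not Nova's configured values):

    import time

    def call_with_retries(fn, attempts=3, delay=1.0):
        # Invoke a deallocation callable a bounded number of times,
        # re-raising the last failure once attempts are exhausted.
        for attempt in range(1, attempts + 1):
            try:
                return fn()
            except Exception:
                if attempt == attempts:
                    raise
                time.sleep(delay)  # back off before the next attempt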
[ 1198.182696] env[68617]: DEBUG oslo_concurrency.lockutils [None req-736e2c4f-639a-4754-a01a-e2e8f03ac09f tempest-ServerAddressesNegativeTestJSON-1064956280 tempest-ServerAddressesNegativeTestJSON-1064956280-project-member] Lock "f560b4df-fb57-4f7b-8a8b-53325970e06e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.464s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1198.409921] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-482362fb-c977-4250-b843-4ff4060fa11b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1198.419442] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58ad0345-17ea-417d-b71c-e0d9836aac96 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1198.460125] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bc62f0f-b04f-4c27-8d49-061c926161d7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1198.469805] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b2d0c2d-aabe-4a34-99c8-92d35ec7e5b0 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1198.484058] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470793, 'name': CreateVM_Task, 'duration_secs': 0.299723} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1198.494391] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1198.494905] env[68617]: DEBUG nova.compute.provider_tree [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1198.496889] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1198.497090] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1198.497428] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquired external semaphore "[datastore2]
devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1198.498010] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9426c386-364d-4511-ad38-de08147859b2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1198.502926] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Waiting for the task: (returnval){ [ 1198.502926] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52e65d8a-9247-7532-157b-b6177da5bf54" [ 1198.502926] env[68617]: _type = "Task" [ 1198.502926] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1198.507408] env[68617]: DEBUG nova.scheduler.client.report [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1198.515633] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52e65d8a-9247-7532-157b-b6177da5bf54, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1198.525171] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.735s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1198.525677] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Start building networks asynchronously for instance. 
{{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1198.571985] env[68617]: DEBUG nova.compute.utils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1198.573414] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1198.573592] env[68617]: DEBUG nova.network.neutron [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1198.585415] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1198.672986] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1198.712684] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=<?>,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-17T05:31:26Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1198.713289] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1198.713289] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1198.713289] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1198.713474] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1198.713625] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1198.713837] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1198.713996] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies
/opt/stack/nova/nova/virt/hardware.py:471}} [ 1198.714182] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1198.714346] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1198.714517] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1198.715741] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e12af627-b9e7-4588-a6f8-39ee62af05df {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1198.729669] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd6e4746-24b4-492d-ae1f-a8df9fda769c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1198.863020] env[68617]: DEBUG nova.policy [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6a52ac4298854c2481284a1d27a5e808', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2585c2453a3f41ac85950f43c05b7025', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 1199.015308] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1199.015308] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1199.015308] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquiring lock "[datastore2] 
devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1199.219025] env[68617]: DEBUG nova.compute.manager [req-4f1a64a4-b9e9-4e03-a18d-5ba40cf5336e req-eada9f6f-01d8-4f8a-a415-6a91b77f012e service nova] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Received event network-changed-a8c9898d-e9e6-4134-b06f-fb23e04bdf4b {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1199.219239] env[68617]: DEBUG nova.compute.manager [req-4f1a64a4-b9e9-4e03-a18d-5ba40cf5336e req-eada9f6f-01d8-4f8a-a415-6a91b77f012e service nova] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Refreshing instance network info cache due to event network-changed-a8c9898d-e9e6-4134-b06f-fb23e04bdf4b. {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1199.219452] env[68617]: DEBUG oslo_concurrency.lockutils [req-4f1a64a4-b9e9-4e03-a18d-5ba40cf5336e req-eada9f6f-01d8-4f8a-a415-6a91b77f012e service nova] Acquiring lock "refresh_cache-1cc42c7f-8781-40b0-9f75-edfef3bc90e7" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1199.219593] env[68617]: DEBUG oslo_concurrency.lockutils [req-4f1a64a4-b9e9-4e03-a18d-5ba40cf5336e req-eada9f6f-01d8-4f8a-a415-6a91b77f012e service nova] Acquired lock "refresh_cache-1cc42c7f-8781-40b0-9f75-edfef3bc90e7" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1199.219761] env[68617]: DEBUG nova.network.neutron [req-4f1a64a4-b9e9-4e03-a18d-5ba40cf5336e req-eada9f6f-01d8-4f8a-a415-6a91b77f012e service nova] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Refreshing network info cache for port a8c9898d-e9e6-4134-b06f-fb23e04bdf4b {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1199.484588] env[68617]: DEBUG nova.network.neutron [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Successfully created port: 64dfe27a-0fd7-40be-8aa3-22989b8a18b6 {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1199.865394] env[68617]: DEBUG nova.network.neutron [req-4f1a64a4-b9e9-4e03-a18d-5ba40cf5336e req-eada9f6f-01d8-4f8a-a415-6a91b77f012e service nova] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Updated VIF entry in instance network info cache for port a8c9898d-e9e6-4134-b06f-fb23e04bdf4b. 
{{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1199.865751] env[68617]: DEBUG nova.network.neutron [req-4f1a64a4-b9e9-4e03-a18d-5ba40cf5336e req-eada9f6f-01d8-4f8a-a415-6a91b77f012e service nova] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Updating instance_info_cache with network_info: [{"id": "a8c9898d-e9e6-4134-b06f-fb23e04bdf4b", "address": "fa:16:3e:a5:64:8e", "network": {"id": "62b5016e-8314-483a-87c5-3175a0b6a0eb", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1469879935-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2585c2453a3f41ac85950f43c05b7025", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2ed91b7b-b4ec-486d-ab34-af0afb7ec691", "external-id": "nsx-vlan-transportzone-75", "segmentation_id": 75, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa8c9898d-e9", "ovs_interfaceid": "a8c9898d-e9e6-4134-b06f-fb23e04bdf4b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1199.880722] env[68617]: DEBUG oslo_concurrency.lockutils [req-4f1a64a4-b9e9-4e03-a18d-5ba40cf5336e req-eada9f6f-01d8-4f8a-a415-6a91b77f012e service nova] Releasing lock "refresh_cache-1cc42c7f-8781-40b0-9f75-edfef3bc90e7" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1200.621801] env[68617]: DEBUG nova.network.neutron [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Successfully updated port: 64dfe27a-0fd7-40be-8aa3-22989b8a18b6 {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1200.635152] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquiring lock "refresh_cache-d46ca6f3-0ee9-412c-98b4-f639ce4f9228" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1200.635152] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquired lock "refresh_cache-d46ca6f3-0ee9-412c-98b4-f639ce4f9228" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1200.635152] env[68617]: DEBUG nova.network.neutron [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1200.680069] env[68617]: DEBUG nova.network.neutron 
[None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1200.875351] env[68617]: DEBUG nova.network.neutron [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Updating instance_info_cache with network_info: [{"id": "64dfe27a-0fd7-40be-8aa3-22989b8a18b6", "address": "fa:16:3e:c0:53:15", "network": {"id": "62b5016e-8314-483a-87c5-3175a0b6a0eb", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1469879935-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2585c2453a3f41ac85950f43c05b7025", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2ed91b7b-b4ec-486d-ab34-af0afb7ec691", "external-id": "nsx-vlan-transportzone-75", "segmentation_id": 75, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap64dfe27a-0f", "ovs_interfaceid": "64dfe27a-0fd7-40be-8aa3-22989b8a18b6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1200.893705] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Releasing lock "refresh_cache-d46ca6f3-0ee9-412c-98b4-f639ce4f9228" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1200.894623] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Instance network_info: |[{"id": "64dfe27a-0fd7-40be-8aa3-22989b8a18b6", "address": "fa:16:3e:c0:53:15", "network": {"id": "62b5016e-8314-483a-87c5-3175a0b6a0eb", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1469879935-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2585c2453a3f41ac85950f43c05b7025", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2ed91b7b-b4ec-486d-ab34-af0afb7ec691", "external-id": "nsx-vlan-transportzone-75", "segmentation_id": 75, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap64dfe27a-0f", 
"ovs_interfaceid": "64dfe27a-0fd7-40be-8aa3-22989b8a18b6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1200.895464] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c0:53:15', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2ed91b7b-b4ec-486d-ab34-af0afb7ec691', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '64dfe27a-0fd7-40be-8aa3-22989b8a18b6', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1200.904206] env[68617]: DEBUG oslo.service.loopingcall [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1200.904737] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1200.905072] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a7c94374-b794-4d89-b223-f5e7d267da25 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1200.925462] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1200.925462] env[68617]: value = "task-3470794" [ 1200.925462] env[68617]: _type = "Task" [ 1200.925462] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1200.933910] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470794, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1201.439522] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470794, 'name': CreateVM_Task, 'duration_secs': 0.278393} completed successfully. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1201.439522] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1201.439522] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1201.439522] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1201.439522] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1201.439704] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-65238e22-8fdd-428d-9a3f-a15e84714c1f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1201.445560] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Waiting for the task: (returnval){ [ 1201.445560] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52b12d1a-5adc-46f4-8f1d-14c544bea889" [ 1201.445560] env[68617]: _type = "Task" [ 1201.445560] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1201.456566] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52b12d1a-5adc-46f4-8f1d-14c544bea889, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1201.460736] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a3936b72-dd94-4e40-a012-ddb78915f308 tempest-ServerGroupTestJSON-1648189536 tempest-ServerGroupTestJSON-1648189536-project-member] Acquiring lock "570302ee-2383-4659-80e1-af4b16d03a21" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1201.462014] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a3936b72-dd94-4e40-a012-ddb78915f308 tempest-ServerGroupTestJSON-1648189536 tempest-ServerGroupTestJSON-1648189536-project-member] Lock "570302ee-2383-4659-80e1-af4b16d03a21" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1201.560876] env[68617]: DEBUG nova.compute.manager [req-a81dac59-1ec5-4b1b-bbbe-6a14ec6b97f2 req-7a5fd13c-344b-4873-bd5d-8dfe5bb9843b service nova] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Received event network-vif-plugged-64dfe27a-0fd7-40be-8aa3-22989b8a18b6 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1201.560876] env[68617]: DEBUG oslo_concurrency.lockutils [req-a81dac59-1ec5-4b1b-bbbe-6a14ec6b97f2 req-7a5fd13c-344b-4873-bd5d-8dfe5bb9843b service nova] Acquiring lock "d46ca6f3-0ee9-412c-98b4-f639ce4f9228-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1201.560876] env[68617]: DEBUG oslo_concurrency.lockutils [req-a81dac59-1ec5-4b1b-bbbe-6a14ec6b97f2 req-7a5fd13c-344b-4873-bd5d-8dfe5bb9843b service nova] Lock "d46ca6f3-0ee9-412c-98b4-f639ce4f9228-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1201.560876] env[68617]: DEBUG oslo_concurrency.lockutils [req-a81dac59-1ec5-4b1b-bbbe-6a14ec6b97f2 req-7a5fd13c-344b-4873-bd5d-8dfe5bb9843b service nova] Lock "d46ca6f3-0ee9-412c-98b4-f639ce4f9228-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1201.561376] env[68617]: DEBUG nova.compute.manager [req-a81dac59-1ec5-4b1b-bbbe-6a14ec6b97f2 req-7a5fd13c-344b-4873-bd5d-8dfe5bb9843b service nova] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] No waiting events found dispatching network-vif-plugged-64dfe27a-0fd7-40be-8aa3-22989b8a18b6 {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1201.561376] env[68617]: WARNING nova.compute.manager [req-a81dac59-1ec5-4b1b-bbbe-6a14ec6b97f2 req-7a5fd13c-344b-4873-bd5d-8dfe5bb9843b service nova] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Received unexpected event network-vif-plugged-64dfe27a-0fd7-40be-8aa3-22989b8a18b6 for instance with vm_state building and task_state spawning.
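The WARNING above is benign and common during spawn: Neutron delivered network-vif-plugged before the compute manager registered a waiter for it, so pop_instance_event finds nothing to dispatch and the event is dropped. The InstanceEvents machinery behind the "<uuid>-events" lock lines is essentially a table of per-instance latches; a minimal sketch of that pattern (hypothetical names, not Nova's actual implementation) looks like this:

    import threading

    class InstanceEventLatches:
        """Per-instance latches keyed by event name (illustrative sketch)."""

        def __init__(self):
            self._lock = threading.Lock()   # plays the role of the "<uuid>-events" lock
            self._latches = {}              # (instance_uuid, event) -> threading.Event

        def prepare(self, instance_uuid, event):
            # Registered by the code path that will later wait on the event.
            with self._lock:
                return self._latches.setdefault((instance_uuid, event),
                                                threading.Event())

        def pop_and_dispatch(self, instance_uuid, event):
            # Called when the external event arrives from Neutron.
            with self._lock:
                latch = self._latches.pop((instance_uuid, event), None)
            if latch is None:
                # Mirrors "No waiting events found dispatching ..." followed
                # by the "Received unexpected event ..." WARNING above.
                print('unexpected event %s for %s' % (event, instance_uuid))
            else:
                latch.set()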
[ 1201.561376] env[68617]: DEBUG nova.compute.manager [req-a81dac59-1ec5-4b1b-bbbe-6a14ec6b97f2 req-7a5fd13c-344b-4873-bd5d-8dfe5bb9843b service nova] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Received event network-changed-64dfe27a-0fd7-40be-8aa3-22989b8a18b6 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1201.561376] env[68617]: DEBUG nova.compute.manager [req-a81dac59-1ec5-4b1b-bbbe-6a14ec6b97f2 req-7a5fd13c-344b-4873-bd5d-8dfe5bb9843b service nova] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Refreshing instance network info cache due to event network-changed-64dfe27a-0fd7-40be-8aa3-22989b8a18b6. {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1201.561376] env[68617]: DEBUG oslo_concurrency.lockutils [req-a81dac59-1ec5-4b1b-bbbe-6a14ec6b97f2 req-7a5fd13c-344b-4873-bd5d-8dfe5bb9843b service nova] Acquiring lock "refresh_cache-d46ca6f3-0ee9-412c-98b4-f639ce4f9228" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1201.561579] env[68617]: DEBUG oslo_concurrency.lockutils [req-a81dac59-1ec5-4b1b-bbbe-6a14ec6b97f2 req-7a5fd13c-344b-4873-bd5d-8dfe5bb9843b service nova] Acquired lock "refresh_cache-d46ca6f3-0ee9-412c-98b4-f639ce4f9228" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1201.561579] env[68617]: DEBUG nova.network.neutron [req-a81dac59-1ec5-4b1b-bbbe-6a14ec6b97f2 req-7a5fd13c-344b-4873-bd5d-8dfe5bb9843b service nova] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Refreshing network info cache for port 64dfe27a-0fd7-40be-8aa3-22989b8a18b6 {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1201.956980] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1201.961195] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1201.961542] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1201.973565] env[68617]: DEBUG nova.network.neutron [req-a81dac59-1ec5-4b1b-bbbe-6a14ec6b97f2 req-7a5fd13c-344b-4873-bd5d-8dfe5bb9843b service nova] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Updated VIF entry in instance network info cache for port 64dfe27a-0fd7-40be-8aa3-22989b8a18b6. 
{{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1201.973906] env[68617]: DEBUG nova.network.neutron [req-a81dac59-1ec5-4b1b-bbbe-6a14ec6b97f2 req-7a5fd13c-344b-4873-bd5d-8dfe5bb9843b service nova] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Updating instance_info_cache with network_info: [{"id": "64dfe27a-0fd7-40be-8aa3-22989b8a18b6", "address": "fa:16:3e:c0:53:15", "network": {"id": "62b5016e-8314-483a-87c5-3175a0b6a0eb", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1469879935-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2585c2453a3f41ac85950f43c05b7025", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2ed91b7b-b4ec-486d-ab34-af0afb7ec691", "external-id": "nsx-vlan-transportzone-75", "segmentation_id": 75, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap64dfe27a-0f", "ovs_interfaceid": "64dfe27a-0fd7-40be-8aa3-22989b8a18b6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1201.983866] env[68617]: DEBUG oslo_concurrency.lockutils [req-a81dac59-1ec5-4b1b-bbbe-6a14ec6b97f2 req-7a5fd13c-344b-4873-bd5d-8dfe5bb9843b service nova] Releasing lock "refresh_cache-d46ca6f3-0ee9-412c-98b4-f639ce4f9228" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1206.653894] env[68617]: DEBUG oslo_concurrency.lockutils [None req-7f0054fc-a131-44a2-aa99-cf76b1d25111 tempest-ServerMetadataTestJSON-2102659189 tempest-ServerMetadataTestJSON-2102659189-project-member] Acquiring lock "7bf75617-fcd8-4d96-bf02-ddb723e8ad96" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1206.654285] env[68617]: DEBUG oslo_concurrency.lockutils [None req-7f0054fc-a131-44a2-aa99-cf76b1d25111 tempest-ServerMetadataTestJSON-2102659189 tempest-ServerMetadataTestJSON-2102659189-project-member] Lock "7bf75617-fcd8-4d96-bf02-ddb723e8ad96" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1234.805051] env[68617]: WARNING oslo_vmware.rw_handles [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1234.805051] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1234.805051] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1234.805051] env[68617]: ERROR oslo_vmware.rw_handles
self._conn.getresponse() [ 1234.805051] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1234.805051] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 1234.805051] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1234.805051] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1234.805051] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1234.805051] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1234.805051] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1234.805051] env[68617]: ERROR oslo_vmware.rw_handles [ 1234.805708] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/964df620-00e0-49d2-84e3-d804cf53d12e/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1234.807890] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1234.808196] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Copying Virtual Disk [datastore2] vmware_temp/964df620-00e0-49d2-84e3-d804cf53d12e/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/964df620-00e0-49d2-84e3-d804cf53d12e/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1234.808512] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-aba98ea5-9376-451d-9dc2-e4f407d0087f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1234.816684] env[68617]: DEBUG oslo_vmware.api [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Waiting for the task: (returnval){ [ 1234.816684] env[68617]: value = "task-3470795" [ 1234.816684] env[68617]: _type = "Task" [ 1234.816684] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1234.824875] env[68617]: DEBUG oslo_vmware.api [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Task: {'id': task-3470795, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1235.326641] env[68617]: DEBUG oslo_vmware.exceptions [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Fault InvalidArgument not matched. {{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1235.326921] env[68617]: DEBUG oslo_concurrency.lockutils [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1235.327518] env[68617]: ERROR nova.compute.manager [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1235.327518] env[68617]: Faults: ['InvalidArgument'] [ 1235.327518] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Traceback (most recent call last): [ 1235.327518] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1235.327518] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] yield resources [ 1235.327518] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1235.327518] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] self.driver.spawn(context, instance, image_meta, [ 1235.327518] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1235.327518] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1235.327518] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1235.327518] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] self._fetch_image_if_missing(context, vi) [ 1235.327518] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1235.327894] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] image_cache(vi, tmp_image_ds_loc) [ 1235.327894] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1235.327894] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] vm_util.copy_virtual_disk( [ 1235.327894] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1235.327894] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] session._wait_for_task(vmdk_copy_task) [ 1235.327894] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1235.327894] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] return self.wait_for_task(task_ref) [ 1235.327894] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1235.327894] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] return evt.wait() [ 1235.327894] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1235.327894] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] result = hub.switch() [ 1235.327894] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1235.327894] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] return self.greenlet.switch() [ 1235.328227] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1235.328227] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] self.f(*self.args, **self.kw) [ 1235.328227] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1235.328227] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] raise exceptions.translate_fault(task_info.error) [ 1235.328227] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1235.328227] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Faults: ['InvalidArgument'] [ 1235.328227] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] [ 1235.328227] env[68617]: INFO nova.compute.manager [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Terminating instance [ 1235.329359] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1235.329572] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 
tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1235.329804] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c84f64a6-37b2-4a38-970f-340cf910d391 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1235.332039] env[68617]: DEBUG nova.compute.manager [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1235.332227] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1235.332931] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2beecac-2ce7-4385-a4ce-9801729f1d05 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1235.339787] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1235.339995] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c8580c4a-3132-4d3f-94a8-73a3b389d151 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1235.342149] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1235.342315] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1235.343264] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2f64da4f-df5f-4153-836d-9a715f4a40f8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1235.347603] env[68617]: DEBUG oslo_vmware.api [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Waiting for the task: (returnval){ [ 1235.347603] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52946b3d-6142-e140-89d7-b5cde8442672" [ 1235.347603] env[68617]: _type = "Task" [ 1235.347603] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1235.354635] env[68617]: DEBUG oslo_vmware.api [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52946b3d-6142-e140-89d7-b5cde8442672, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1235.557069] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1235.557213] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1235.557396] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Deleting the datastore file [datastore2] 6eef6e24-cf49-458b-ae37-8da4e02045f8 {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1235.557721] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d2c5d489-819e-40ce-9fbd-3f34a80171db {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1235.565006] env[68617]: DEBUG oslo_vmware.api [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Waiting for the task: (returnval){ [ 1235.565006] env[68617]: value = "task-3470797" [ 1235.565006] env[68617]: _type = "Task" [ 1235.565006] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1235.572737] env[68617]: DEBUG oslo_vmware.api [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Task: {'id': task-3470797, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1235.857261] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1235.857635] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Creating directory with path [datastore2] vmware_temp/789e0af7-649f-4a0a-9a13-8dd5797fc88f/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1235.857762] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d3eb9b4f-9f3b-4d8a-990a-5cc4cdc0db47 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1235.869447] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Created directory with path [datastore2] vmware_temp/789e0af7-649f-4a0a-9a13-8dd5797fc88f/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1235.869639] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Fetch image to [datastore2] vmware_temp/789e0af7-649f-4a0a-9a13-8dd5797fc88f/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1235.869806] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/789e0af7-649f-4a0a-9a13-8dd5797fc88f/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1235.870544] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ecf271b5-231a-4069-bf94-83140033a42b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1235.877117] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f622f317-4191-40dd-881f-ba14bef06112 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1235.885810] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a290d1b0-cbfe-4551-9b4c-59d6f7bd6aa6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1235.916923] env[68617]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e5ee5f3-c9d6-4fe4-9f3a-bfe9df56da4e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1235.922230] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-53e7f0c9-db64-4aae-9dd3-10bb0d59bed8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1235.940790] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1235.989529] env[68617]: DEBUG oslo_vmware.rw_handles [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/789e0af7-649f-4a0a-9a13-8dd5797fc88f/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1236.048945] env[68617]: DEBUG oslo_vmware.rw_handles [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1236.049060] env[68617]: DEBUG oslo_vmware.rw_handles [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/789e0af7-649f-4a0a-9a13-8dd5797fc88f/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1236.074266] env[68617]: DEBUG oslo_vmware.api [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Task: {'id': task-3470797, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.084582} completed successfully. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1236.074513] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1236.074696] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1236.074866] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1236.075051] env[68617]: INFO nova.compute.manager [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Took 0.74 seconds to destroy the instance on the hypervisor. [ 1236.077085] env[68617]: DEBUG nova.compute.claims [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1236.077257] env[68617]: DEBUG oslo_concurrency.lockutils [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1236.077462] env[68617]: DEBUG oslo_concurrency.lockutils [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1236.454778] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aae2dd4f-0e34-4b7e-a50d-9a44ba1020dc {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1236.462485] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6aae333-bbb5-40d4-81f1-db0ed500ea31 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1236.491394] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdba2a4b-b739-4880-945a-5b4eda43a44e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1236.497936] env[68617]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c18a159-625d-4c98-888e-b78d47136b43 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1236.511286] env[68617]: DEBUG nova.compute.provider_tree [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1236.519577] env[68617]: DEBUG nova.scheduler.client.report [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1236.536784] env[68617]: DEBUG oslo_concurrency.lockutils [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.459s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1236.537386] env[68617]: ERROR nova.compute.manager [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1236.537386] env[68617]: Faults: ['InvalidArgument'] [ 1236.537386] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Traceback (most recent call last): [ 1236.537386] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1236.537386] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] self.driver.spawn(context, instance, image_meta, [ 1236.537386] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1236.537386] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1236.537386] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1236.537386] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] self._fetch_image_if_missing(context, vi) [ 1236.537386] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 
1236.537386] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] image_cache(vi, tmp_image_ds_loc) [ 1236.537386] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1236.537705] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] vm_util.copy_virtual_disk( [ 1236.537705] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1236.537705] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] session._wait_for_task(vmdk_copy_task) [ 1236.537705] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1236.537705] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] return self.wait_for_task(task_ref) [ 1236.537705] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1236.537705] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] return evt.wait() [ 1236.537705] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1236.537705] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] result = hub.switch() [ 1236.537705] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1236.537705] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] return self.greenlet.switch() [ 1236.537705] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1236.537705] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] self.f(*self.args, **self.kw) [ 1236.538192] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1236.538192] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] raise exceptions.translate_fault(task_info.error) [ 1236.538192] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1236.538192] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Faults: ['InvalidArgument'] [ 1236.538192] env[68617]: ERROR nova.compute.manager [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] [ 1236.538192] env[68617]: DEBUG nova.compute.utils [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] VimFaultException {{(pid=68617) notify_about_instance_usage 
/opt/stack/nova/nova/compute/utils.py:430}} [ 1236.542783] env[68617]: DEBUG nova.compute.manager [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Build of instance 6eef6e24-cf49-458b-ae37-8da4e02045f8 was re-scheduled: A specified parameter was not correct: fileType [ 1236.542783] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1236.543193] env[68617]: DEBUG nova.compute.manager [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1236.543381] env[68617]: DEBUG nova.compute.manager [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1236.543547] env[68617]: DEBUG nova.compute.manager [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1236.543707] env[68617]: DEBUG nova.network.neutron [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1236.879882] env[68617]: DEBUG nova.network.neutron [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1236.890300] env[68617]: INFO nova.compute.manager [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Took 0.35 seconds to deallocate network for instance. 
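Both spawn failures in this excerpt surface through the same choke point visible in the tracebacks: oslo.vmware's wait_for_task polls the vCenter TaskInfo inside a looping call and, when the task errors, raises the translated fault (here VimFaultException with Faults: ['InvalidArgument']). A stripped-down sketch of that polling loop, with fetch_task_info standing in for the real SOAP property read (assumed names, not the library's API), might look like:

    import time

    class VimFaultException(Exception):
        """Carries the VIM fault names next to the message (sketch)."""
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list

    def wait_for_task(fetch_task_info, task_id, poll_interval=0.5):
        # fetch_task_info is an injected callable returning
        # (state, progress, error_message, faults) for the task.
        while True:
            state, progress, error_message, faults = fetch_task_info(task_id)
            if state in ('queued', 'running'):
                # Produces the repeated "Task: {...} progress is 0%." lines.
                print('Task %s progress is %s%%.' % (task_id, progress))
                time.sleep(poll_interval)
            elif state == 'success':
                return
            else:
                # The equivalent of "raise exceptions.translate_fault(...)"
                # in the tracebacks above; for the failing CopyVirtualDisk_Task
                # this is "A specified parameter was not correct: fileType"
                # with Faults: ['InvalidArgument'].
                raise VimFaultException(faults, error_message)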
[ 1236.981836] env[68617]: INFO nova.scheduler.client.report [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Deleted allocations for instance 6eef6e24-cf49-458b-ae37-8da4e02045f8 [ 1237.003962] env[68617]: DEBUG oslo_concurrency.lockutils [None req-bdce121c-606c-4421-a711-6af0377b5d0b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "6eef6e24-cf49-458b-ae37-8da4e02045f8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 529.706s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1237.005072] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6bec6ec9-d9dc-4a85-a09e-ef0a836883fb tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "6eef6e24-cf49-458b-ae37-8da4e02045f8" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 328.677s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1237.005304] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6bec6ec9-d9dc-4a85-a09e-ef0a836883fb tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "6eef6e24-cf49-458b-ae37-8da4e02045f8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1237.005520] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6bec6ec9-d9dc-4a85-a09e-ef0a836883fb tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "6eef6e24-cf49-458b-ae37-8da4e02045f8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1237.005687] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6bec6ec9-d9dc-4a85-a09e-ef0a836883fb tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "6eef6e24-cf49-458b-ae37-8da4e02045f8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1237.007630] env[68617]: INFO nova.compute.manager [None req-6bec6ec9-d9dc-4a85-a09e-ef0a836883fb tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Terminating instance [ 1237.009881] env[68617]: DEBUG nova.compute.manager [None req-6bec6ec9-d9dc-4a85-a09e-ef0a836883fb tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Start destroying the instance on the hypervisor. 
{{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1237.010179] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-6bec6ec9-d9dc-4a85-a09e-ef0a836883fb tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1237.010707] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5782c718-4b2b-400f-a562-c75e9c9fd8fb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1237.020161] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79e494fb-f251-464c-8540-8010f104eb0c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1237.031051] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1237.050885] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-6bec6ec9-d9dc-4a85-a09e-ef0a836883fb tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 6eef6e24-cf49-458b-ae37-8da4e02045f8 could not be found. [ 1237.051094] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-6bec6ec9-d9dc-4a85-a09e-ef0a836883fb tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1237.051273] env[68617]: INFO nova.compute.manager [None req-6bec6ec9-d9dc-4a85-a09e-ef0a836883fb tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1237.051535] env[68617]: DEBUG oslo.service.loopingcall [None req-6bec6ec9-d9dc-4a85-a09e-ef0a836883fb tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1237.051806] env[68617]: DEBUG nova.compute.manager [-] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1237.051908] env[68617]: DEBUG nova.network.neutron [-] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1237.077241] env[68617]: DEBUG nova.network.neutron [-] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1237.080035] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1237.080273] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1237.081741] env[68617]: INFO nova.compute.claims [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1237.085201] env[68617]: INFO nova.compute.manager [-] [instance: 6eef6e24-cf49-458b-ae37-8da4e02045f8] Took 0.03 seconds to deallocate network for instance. 
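
The "Acquiring lock / acquired ... waited / released ... held" triplets around the compute_resources claim above are produced by oslo.concurrency's lock wrapper, which logs wait and hold durations at DEBUG. A minimal sketch of the pattern, assuming only the documented lockutils.synchronized decorator; the function below is a placeholder, not the real ResourceTracker method:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def instance_claim():
        # Runs with the named in-process lock held; the lockutils wrapper
        # logs the acquire (with time waited) and release (with time held)
        # lines seen throughout this log.
        pass

    instance_claim()

Because wait and hold times are logged on every acquire and release, contention on compute_resources is directly visible in these DEBUG lines.
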
[ 1237.206518] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6bec6ec9-d9dc-4a85-a09e-ef0a836883fb tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "6eef6e24-cf49-458b-ae37-8da4e02045f8" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.201s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1237.540013] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-302ad950-dde3-4745-a1ff-be01e1dcdc31 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1237.547117] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65f8bc0b-87af-4cdd-a31c-1249230074ac {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1237.577032] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91f3a201-1866-408a-9fb1-f67251d0f27d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1237.584173] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0934a526-cb44-4b43-b77d-9f76ce7adecf {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1237.596922] env[68617]: DEBUG nova.compute.provider_tree [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1237.605046] env[68617]: DEBUG nova.scheduler.client.report [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1237.621809] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.541s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1237.622298] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Start building networks asynchronously for instance. 
{{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1237.660342] env[68617]: DEBUG nova.compute.utils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1237.661924] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1237.661924] env[68617]: DEBUG nova.network.neutron [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1237.670558] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1237.724194] env[68617]: DEBUG nova.policy [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6a52ac4298854c2481284a1d27a5e808', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2585c2453a3f41ac85950f43c05b7025', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 1237.737585] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1237.769281] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1237.769515] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1237.769672] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1237.769850] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1237.770017] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1237.770153] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1237.770370] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1237.770526] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 1237.770690] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1237.770874] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1237.771088] env[68617]: DEBUG nova.virt.hardware [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1237.772306] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c00461a7-3f2d-4d07-8d14-1f71e84b0d94 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1237.780103] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eea52d81-5b11-43bb-9d07-216ad1845719 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1238.025691] env[68617]: DEBUG nova.network.neutron [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Successfully created port: 9562e05a-f31e-4b09-baa9-15c8703cd626 {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1238.622685] env[68617]: DEBUG nova.network.neutron [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Successfully updated port: 9562e05a-f31e-4b09-baa9-15c8703cd626 {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1238.637322] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquiring lock "refresh_cache-a8ff6232-530c-453a-96e4-f8ce00f976e3" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1238.637322] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquired lock "refresh_cache-a8ff6232-530c-453a-96e4-f8ce00f976e3" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1238.637322] env[68617]: DEBUG nova.network.neutron [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Building network info cache for instance {{(pid=68617) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:2010}} [ 1238.674756] env[68617]: DEBUG nova.network.neutron [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1238.845728] env[68617]: DEBUG nova.network.neutron [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Updating instance_info_cache with network_info: [{"id": "9562e05a-f31e-4b09-baa9-15c8703cd626", "address": "fa:16:3e:e2:e3:42", "network": {"id": "62b5016e-8314-483a-87c5-3175a0b6a0eb", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1469879935-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2585c2453a3f41ac85950f43c05b7025", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2ed91b7b-b4ec-486d-ab34-af0afb7ec691", "external-id": "nsx-vlan-transportzone-75", "segmentation_id": 75, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9562e05a-f3", "ovs_interfaceid": "9562e05a-f31e-4b09-baa9-15c8703cd626", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1238.859058] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Releasing lock "refresh_cache-a8ff6232-530c-453a-96e4-f8ce00f976e3" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1238.859368] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Instance network_info: |[{"id": "9562e05a-f31e-4b09-baa9-15c8703cd626", "address": "fa:16:3e:e2:e3:42", "network": {"id": "62b5016e-8314-483a-87c5-3175a0b6a0eb", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1469879935-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2585c2453a3f41ac85950f43c05b7025", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2ed91b7b-b4ec-486d-ab34-af0afb7ec691", "external-id": 
"nsx-vlan-transportzone-75", "segmentation_id": 75, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9562e05a-f3", "ovs_interfaceid": "9562e05a-f31e-4b09-baa9-15c8703cd626", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1238.859781] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e2:e3:42', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2ed91b7b-b4ec-486d-ab34-af0afb7ec691', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '9562e05a-f31e-4b09-baa9-15c8703cd626', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1238.867563] env[68617]: DEBUG oslo.service.loopingcall [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1238.868048] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1238.868288] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cbed1dad-e26e-42fe-b680-fe70af97b366 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1238.888134] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1238.888134] env[68617]: value = "task-3470798" [ 1238.888134] env[68617]: _type = "Task" [ 1238.888134] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1238.895707] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470798, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1238.912048] env[68617]: DEBUG nova.compute.manager [req-724e2e4b-019d-4d62-94c6-c6283566f195 req-ec573965-d613-4886-9bd7-55142e4ba141 service nova] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Received event network-vif-plugged-9562e05a-f31e-4b09-baa9-15c8703cd626 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1238.912323] env[68617]: DEBUG oslo_concurrency.lockutils [req-724e2e4b-019d-4d62-94c6-c6283566f195 req-ec573965-d613-4886-9bd7-55142e4ba141 service nova] Acquiring lock "a8ff6232-530c-453a-96e4-f8ce00f976e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1238.912542] env[68617]: DEBUG oslo_concurrency.lockutils [req-724e2e4b-019d-4d62-94c6-c6283566f195 req-ec573965-d613-4886-9bd7-55142e4ba141 service nova] Lock "a8ff6232-530c-453a-96e4-f8ce00f976e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1238.912612] env[68617]: DEBUG oslo_concurrency.lockutils [req-724e2e4b-019d-4d62-94c6-c6283566f195 req-ec573965-d613-4886-9bd7-55142e4ba141 service nova] Lock "a8ff6232-530c-453a-96e4-f8ce00f976e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1238.912811] env[68617]: DEBUG nova.compute.manager [req-724e2e4b-019d-4d62-94c6-c6283566f195 req-ec573965-d613-4886-9bd7-55142e4ba141 service nova] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] No waiting events found dispatching network-vif-plugged-9562e05a-f31e-4b09-baa9-15c8703cd626 {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1238.912915] env[68617]: WARNING nova.compute.manager [req-724e2e4b-019d-4d62-94c6-c6283566f195 req-ec573965-d613-4886-9bd7-55142e4ba141 service nova] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Received unexpected event network-vif-plugged-9562e05a-f31e-4b09-baa9-15c8703cd626 for instance with vm_state building and task_state spawning. [ 1238.913065] env[68617]: DEBUG nova.compute.manager [req-724e2e4b-019d-4d62-94c6-c6283566f195 req-ec573965-d613-4886-9bd7-55142e4ba141 service nova] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Received event network-changed-9562e05a-f31e-4b09-baa9-15c8703cd626 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1238.913317] env[68617]: DEBUG nova.compute.manager [req-724e2e4b-019d-4d62-94c6-c6283566f195 req-ec573965-d613-4886-9bd7-55142e4ba141 service nova] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Refreshing instance network info cache due to event network-changed-9562e05a-f31e-4b09-baa9-15c8703cd626. 
{{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1238.913469] env[68617]: DEBUG oslo_concurrency.lockutils [req-724e2e4b-019d-4d62-94c6-c6283566f195 req-ec573965-d613-4886-9bd7-55142e4ba141 service nova] Acquiring lock "refresh_cache-a8ff6232-530c-453a-96e4-f8ce00f976e3" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1238.913607] env[68617]: DEBUG oslo_concurrency.lockutils [req-724e2e4b-019d-4d62-94c6-c6283566f195 req-ec573965-d613-4886-9bd7-55142e4ba141 service nova] Acquired lock "refresh_cache-a8ff6232-530c-453a-96e4-f8ce00f976e3" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1238.913761] env[68617]: DEBUG nova.network.neutron [req-724e2e4b-019d-4d62-94c6-c6283566f195 req-ec573965-d613-4886-9bd7-55142e4ba141 service nova] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Refreshing network info cache for port 9562e05a-f31e-4b09-baa9-15c8703cd626 {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1239.334968] env[68617]: DEBUG nova.network.neutron [req-724e2e4b-019d-4d62-94c6-c6283566f195 req-ec573965-d613-4886-9bd7-55142e4ba141 service nova] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Updated VIF entry in instance network info cache for port 9562e05a-f31e-4b09-baa9-15c8703cd626. {{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1239.335348] env[68617]: DEBUG nova.network.neutron [req-724e2e4b-019d-4d62-94c6-c6283566f195 req-ec573965-d613-4886-9bd7-55142e4ba141 service nova] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Updating instance_info_cache with network_info: [{"id": "9562e05a-f31e-4b09-baa9-15c8703cd626", "address": "fa:16:3e:e2:e3:42", "network": {"id": "62b5016e-8314-483a-87c5-3175a0b6a0eb", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1469879935-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2585c2453a3f41ac85950f43c05b7025", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2ed91b7b-b4ec-486d-ab34-af0afb7ec691", "external-id": "nsx-vlan-transportzone-75", "segmentation_id": 75, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9562e05a-f3", "ovs_interfaceid": "9562e05a-f31e-4b09-baa9-15c8703cd626", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1239.344878] env[68617]: DEBUG oslo_concurrency.lockutils [req-724e2e4b-019d-4d62-94c6-c6283566f195 req-ec573965-d613-4886-9bd7-55142e4ba141 service nova] Releasing lock "refresh_cache-a8ff6232-530c-453a-96e4-f8ce00f976e3" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1239.397855] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470798, 'name': CreateVM_Task, 'duration_secs': 0.324867} completed successfully. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1239.398053] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1239.399531] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1239.399630] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1239.399904] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1239.400191] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-045b9aa4-21cb-4fc7-b299-4abe9deb13fa {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1239.404773] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Waiting for the task: (returnval){ [ 1239.404773] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52712664-b3d3-0641-e003-bc10a2f18b3e" [ 1239.404773] env[68617]: _type = "Task" [ 1239.404773] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1239.412663] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52712664-b3d3-0641-e003-bc10a2f18b3e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1239.914724] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1239.915014] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1239.915242] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1241.699807] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1241.700080] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Cleaning up deleted instances with incomplete migration {{(pid=68617) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1243.709531] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1243.709531] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Cleaning up deleted instances {{(pid=68617) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1243.723533] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] There are 0 instances to clean {{(pid=68617) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1245.716230] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1245.716230] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1245.716230] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1245.741820] env[68617]: DEBUG nova.compute.manager [None 
req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1245.742015] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1245.743668] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1245.743805] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1245.743931] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1245.744067] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1245.744188] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1245.744305] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1245.744633] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1245.744633] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1245.744747] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1246.698743] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1246.734968] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1247.699572] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1248.698959] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1249.698873] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1250.706412] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1250.706755] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1251.699854] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1251.700060] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1252.695064] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1254.699205] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1254.713172] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1254.713172] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1254.713420] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1254.713690] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1254.717424] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afd7db42-3b3d-4067-a401-840142f6c1c6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1254.728216] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96a4832f-006c-4507-bb11-6567ad372965 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1254.742738] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74f3ad76-f42b-4195-9859-2b82669e7c70 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1254.749667] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a37bfeaa-bcde-4714-bd74-fc763784a2a4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1254.779882] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180923MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1254.780160] env[68617]: DEBUG 
oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1254.780286] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1254.962652] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 71b1ebba-2019-4378-9bd2-98a7559c22e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1254.962818] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e6b6cbdd-11d6-44a6-8da7-98e0f52cef67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1254.962945] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b27ace75-e2fa-4acc-96cb-88dd49b89de5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1254.963122] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 995585f5-57a4-4ba6-9e28-18a086af264c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1254.963300] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 82864ac3-a199-478c-8c57-97ea0a256201 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1254.963494] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1254.963621] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 79c92a1b-20ef-4360-93b4-913cbfcf92fe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1254.963738] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 1cc42c7f-8781-40b0-9f75-edfef3bc90e7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1254.963851] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance d46ca6f3-0ee9-412c-98b4-f639ce4f9228 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1254.963965] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a8ff6232-530c-453a-96e4-f8ce00f976e3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1254.975556] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1254.988536] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 030eceb1-51a5-4e34-ad67-727b7ebd524f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1254.998235] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 07927d19-2354-4215-b89d-5920e20b8222 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1255.009480] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 59df690b-bfbb-4976-b80b-60106c53ba25 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1255.019268] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 98b47fc9-678d-4c60-b9e5-78423719ae76 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1255.029889] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e90877a8-47d3-47d7-8362-5bcfe3a98c36 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1255.039752] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance c5764a1d-3370-4756-ada0-03b503368d17 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1255.049638] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance c0528a20-34cb-4b51-bb4c-8c3828021a85 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1255.061471] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance fa9b2716-783b-4b19-bfc9-aad609c3a659 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1255.071075] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance dd611e75-aac1-4cdb-b263-6956d6254743 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1255.081366] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 075eb6cb-a53b-44d9-986d-bc85d4b8ac25 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1255.091533] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 65014c6f-8b4e-4468-9462-4b8cdc08af73 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1255.101414] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 7e1c7e8a-139e-4e8a-a3e1-39a2d7c3fc47 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1255.112767] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 2bffd2c4-f290-4df6-b7b6-6dd963befdab has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1255.122520] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 13d6e00b-3c18-4346-b229-b56bdaba2dc8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1255.132385] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f03b9bc5-9438-4c0c-b595-72c631bece08 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1255.142322] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 570302ee-2383-4659-80e1-af4b16d03a21 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1255.152220] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 7bf75617-fcd8-4d96-bf02-ddb723e8ad96 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1255.152462] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1255.152608] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1856MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1255.168214] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Refreshing inventories for resource provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1255.182052] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Updating ProviderTree inventory for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1255.182394] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Updating inventory in ProviderTree for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1255.193044] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Refreshing aggregate associations for resource provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f, aggregates: None {{(pid=68617) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1255.210073] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Refreshing trait associations for resource provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_VMDK {{(pid=68617) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1255.503446] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-603cda09-9f26-4371-bd9a-8d31d77053d8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1255.512144] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-d40f6028-0a4b-4df3-a176-ca749027a0a8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1255.543091] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3cde4711-430b-4f6d-b98f-8647938042f4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1255.551255] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f85c5de7-6769-4634-b391-e03d64525618 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1255.564490] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1255.573803] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1255.588292] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1255.588484] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.808s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1282.331560] env[68617]: WARNING oslo_vmware.rw_handles [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1282.331560] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1282.331560] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1282.331560] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1282.331560] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1282.331560] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 1282.331560] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1282.331560] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1282.331560] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 
1282.331560] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1282.331560] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1282.331560] env[68617]: ERROR oslo_vmware.rw_handles [ 1282.332314] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/789e0af7-649f-4a0a-9a13-8dd5797fc88f/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1282.333902] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1282.334160] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Copying Virtual Disk [datastore2] vmware_temp/789e0af7-649f-4a0a-9a13-8dd5797fc88f/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/789e0af7-649f-4a0a-9a13-8dd5797fc88f/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1282.334494] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-0884e33d-35a2-4261-95c7-ac9c4d844f4b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1282.342339] env[68617]: DEBUG oslo_vmware.api [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Waiting for the task: (returnval){ [ 1282.342339] env[68617]: value = "task-3470799" [ 1282.342339] env[68617]: _type = "Task" [ 1282.342339] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1282.351281] env[68617]: DEBUG oslo_vmware.api [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Task: {'id': task-3470799, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1282.854073] env[68617]: DEBUG oslo_vmware.exceptions [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Fault InvalidArgument not matched. 
{{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1282.854073] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1282.854406] env[68617]: ERROR nova.compute.manager [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1282.854406] env[68617]: Faults: ['InvalidArgument'] [ 1282.854406] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Traceback (most recent call last): [ 1282.854406] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1282.854406] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] yield resources [ 1282.854406] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1282.854406] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] self.driver.spawn(context, instance, image_meta, [ 1282.854406] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1282.854406] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1282.854406] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1282.854406] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] self._fetch_image_if_missing(context, vi) [ 1282.854406] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1282.854795] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] image_cache(vi, tmp_image_ds_loc) [ 1282.854795] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1282.854795] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] vm_util.copy_virtual_disk( [ 1282.854795] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1282.854795] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] session._wait_for_task(vmdk_copy_task) [ 1282.854795] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1282.854795] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] return self.wait_for_task(task_ref) [ 1282.854795] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1282.854795] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] return evt.wait() [ 1282.854795] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1282.854795] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] result = hub.switch() [ 1282.854795] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1282.854795] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] return self.greenlet.switch() [ 1282.855218] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1282.855218] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] self.f(*self.args, **self.kw) [ 1282.855218] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1282.855218] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] raise exceptions.translate_fault(task_info.error) [ 1282.855218] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1282.855218] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Faults: ['InvalidArgument'] [ 1282.855218] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] [ 1282.855218] env[68617]: INFO nova.compute.manager [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Terminating instance [ 1282.856388] env[68617]: DEBUG oslo_concurrency.lockutils [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1282.856603] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1282.856836] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-ee8e327d-c7f1-4108-abf8-7c857e22478a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1282.860216] env[68617]: DEBUG nova.compute.manager [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1282.860408] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1282.861137] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd6fa5ce-1149-4257-a0b9-0584772e8b58 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1282.864690] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1282.864861] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1282.865820] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0a9fc299-e85d-4be2-b834-717d0445e307 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1282.869779] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1282.870261] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-485ce520-bc98-4b8e-bbfa-06dd87b038fb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1282.872573] env[68617]: DEBUG oslo_vmware.api [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Waiting for the task: (returnval){ [ 1282.872573] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]525ad44b-c268-c144-6562-a0f5accadd14" [ 1282.872573] env[68617]: _type = "Task" [ 1282.872573] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1282.886686] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1282.886912] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Creating directory with path [datastore2] vmware_temp/1965b498-1581-4e7f-b211-87ec98203c06/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1282.887181] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c83fb287-c22b-435e-ae05-6f6f1236b5b9 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1282.905947] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Created directory with path [datastore2] vmware_temp/1965b498-1581-4e7f-b211-87ec98203c06/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1282.906161] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Fetch image to [datastore2] vmware_temp/1965b498-1581-4e7f-b211-87ec98203c06/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1282.906336] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/1965b498-1581-4e7f-b211-87ec98203c06/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1282.907063] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-376d00b8-78a9-421b-a97b-38fb07d66e81 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1282.913707] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b413fe9d-623d-458a-810e-ec512ad88e1d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1282.922642] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d69291d1-8051-4ace-9591-99a142e66c92 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1282.955402] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-b5b61b49-107a-4fea-a1bc-5d83323ea376 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1282.958013] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1282.958217] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1282.958385] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Deleting the datastore file [datastore2] 71b1ebba-2019-4378-9bd2-98a7559c22e8 {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1282.958613] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-51640ac4-e995-4b97-82a3-1f810e77fa8c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1282.963293] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7b8225e2-0cb6-4ac7-b0bc-5ef33281477f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1282.965952] env[68617]: DEBUG oslo_vmware.api [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Waiting for the task: (returnval){ [ 1282.965952] env[68617]: value = "task-3470801" [ 1282.965952] env[68617]: _type = "Task" [ 1282.965952] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1282.973026] env[68617]: DEBUG oslo_vmware.api [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Task: {'id': task-3470801, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1282.992793] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1283.047073] env[68617]: DEBUG oslo_vmware.rw_handles [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1965b498-1581-4e7f-b211-87ec98203c06/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1283.106629] env[68617]: DEBUG oslo_vmware.rw_handles [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1283.106840] env[68617]: DEBUG oslo_vmware.rw_handles [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1965b498-1581-4e7f-b211-87ec98203c06/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1283.477014] env[68617]: DEBUG oslo_vmware.api [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Task: {'id': task-3470801, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.06866} completed successfully. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1283.477415] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1283.477541] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1283.477710] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1283.477880] env[68617]: INFO nova.compute.manager [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Took 0.62 seconds to destroy the instance on the hypervisor. [ 1283.480174] env[68617]: DEBUG nova.compute.claims [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1283.480343] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1283.480569] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1283.835554] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f96cb95-374f-4e51-8ec6-b9658e39b8d0 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1283.844051] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00b3d6b4-f3ff-4547-847a-7ef2c245a443 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1283.874721] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f03bae5-5d96-4c94-8d98-b0be5a2d9226 {{(pid=68617) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1283.882292] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bedaebd-2d1c-488d-9a17-62afa249ff51 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1283.895332] env[68617]: DEBUG nova.compute.provider_tree [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1283.904815] env[68617]: DEBUG nova.scheduler.client.report [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1283.920257] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.440s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1283.920758] env[68617]: ERROR nova.compute.manager [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1283.920758] env[68617]: Faults: ['InvalidArgument'] [ 1283.920758] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Traceback (most recent call last): [ 1283.920758] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1283.920758] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] self.driver.spawn(context, instance, image_meta, [ 1283.920758] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1283.920758] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1283.920758] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1283.920758] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] self._fetch_image_if_missing(context, vi) [ 
1283.920758] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1283.920758] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] image_cache(vi, tmp_image_ds_loc) [ 1283.920758] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1283.921235] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] vm_util.copy_virtual_disk( [ 1283.921235] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1283.921235] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] session._wait_for_task(vmdk_copy_task) [ 1283.921235] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1283.921235] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] return self.wait_for_task(task_ref) [ 1283.921235] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1283.921235] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] return evt.wait() [ 1283.921235] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1283.921235] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] result = hub.switch() [ 1283.921235] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1283.921235] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] return self.greenlet.switch() [ 1283.921235] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1283.921235] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] self.f(*self.args, **self.kw) [ 1283.921633] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1283.921633] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] raise exceptions.translate_fault(task_info.error) [ 1283.921633] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1283.921633] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Faults: ['InvalidArgument'] [ 1283.921633] env[68617]: ERROR nova.compute.manager [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] [ 1283.921633] env[68617]: DEBUG nova.compute.utils [None req-0974c327-2775-4b4e-8356-bd872096e848 
tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1283.922819] env[68617]: DEBUG nova.compute.manager [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Build of instance 71b1ebba-2019-4378-9bd2-98a7559c22e8 was re-scheduled: A specified parameter was not correct: fileType [ 1283.922819] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1283.923227] env[68617]: DEBUG nova.compute.manager [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1283.923398] env[68617]: DEBUG nova.compute.manager [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1283.923564] env[68617]: DEBUG nova.compute.manager [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1283.923727] env[68617]: DEBUG nova.network.neutron [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1284.237967] env[68617]: DEBUG nova.network.neutron [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1284.247781] env[68617]: INFO nova.compute.manager [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Took 0.32 seconds to deallocate network for instance. 
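
The failure path above is the standard oslo.vmware task-wait pattern: CopyVirtualDisk_Task is invoked, wait_for_task blocks on a looping poll (_poll_task), and when the vCenter task enters the error state the fault is translated and re-raised. That is how the "A specified parameter was not correct: fileType / Faults: ['InvalidArgument']" fault surfaces through spawn(), aborts the compute claim, re-schedules the build, and ends in network deallocation. A minimal, self-contained sketch of that polling pattern follows; the names (FakeVimTask, poll_task, VimFaultError) are illustrative stand-ins, not oslo.vmware's real API:

    import time


    class VimFaultError(Exception):
        """Stand-in for oslo_vmware.exceptions.VimFaultException."""

        def __init__(self, msg, fault_list):
            super().__init__(msg)
            self.fault_list = fault_list


    class FakeVimTask:
        """Toy task that fails the way CopyVirtualDisk_Task does above."""

        def __init__(self):
            self._polls = 0

        def info(self):
            self._polls += 1
            if self._polls < 3:
                return {"state": "running", "progress": self._polls * 30}
            return {"state": "error",
                    "error": {"msg": "A specified parameter was not correct: fileType",
                              "faults": ["InvalidArgument"]}}


    def poll_task(task, interval=0.5):
        """Poll until the task reaches a terminal state; raise on error."""
        while True:
            info = task.info()
            if info["state"] == "running":
                # cf. the "_poll_task ... progress is 0%" DEBUG lines above
                print("progress is %d%%" % info["progress"])
                time.sleep(interval)
            elif info["state"] == "success":
                return info
            else:
                err = info["error"]
                raise VimFaultError(err["msg"], err["faults"])


    try:
        poll_task(FakeVimTask(), interval=0)
    except VimFaultError as e:
        print("task failed: %s, Faults: %s" % (e, e.fault_list))

The compute manager's traceback above is exactly this shape: the exception raised at the bottom of the poll loop propagates up through wait_for_task, _wait_for_task, copy_virtual_disk, and spawn before the manager catches it and tears the instance back down.
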
[ 1284.339282] env[68617]: INFO nova.scheduler.client.report [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Deleted allocations for instance 71b1ebba-2019-4378-9bd2-98a7559c22e8 [ 1284.373334] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0974c327-2775-4b4e-8356-bd872096e848 tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Lock "71b1ebba-2019-4378-9bd2-98a7559c22e8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 573.766s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1284.373610] env[68617]: DEBUG oslo_concurrency.lockutils [None req-22536537-b16f-43a6-9dc4-cf2c2d6e35ed tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Lock "71b1ebba-2019-4378-9bd2-98a7559c22e8" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 373.785s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1284.373831] env[68617]: DEBUG oslo_concurrency.lockutils [None req-22536537-b16f-43a6-9dc4-cf2c2d6e35ed tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Acquiring lock "71b1ebba-2019-4378-9bd2-98a7559c22e8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1284.374043] env[68617]: DEBUG oslo_concurrency.lockutils [None req-22536537-b16f-43a6-9dc4-cf2c2d6e35ed tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Lock "71b1ebba-2019-4378-9bd2-98a7559c22e8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1284.374211] env[68617]: DEBUG oslo_concurrency.lockutils [None req-22536537-b16f-43a6-9dc4-cf2c2d6e35ed tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Lock "71b1ebba-2019-4378-9bd2-98a7559c22e8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1284.376208] env[68617]: INFO nova.compute.manager [None req-22536537-b16f-43a6-9dc4-cf2c2d6e35ed tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Terminating instance [ 1284.377887] env[68617]: DEBUG nova.compute.manager [None req-22536537-b16f-43a6-9dc4-cf2c2d6e35ed tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Start destroying the instance on the hypervisor. 
{{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1284.378115] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-22536537-b16f-43a6-9dc4-cf2c2d6e35ed tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1284.378579] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0f27aab0-9ff5-4afb-80c3-1958446f0d5e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1284.387497] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba8cc0ea-406b-4089-963b-4b2c9f988821 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1284.398664] env[68617]: DEBUG nova.compute.manager [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1284.419553] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-22536537-b16f-43a6-9dc4-cf2c2d6e35ed tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 71b1ebba-2019-4378-9bd2-98a7559c22e8 could not be found. [ 1284.419553] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-22536537-b16f-43a6-9dc4-cf2c2d6e35ed tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1284.419681] env[68617]: INFO nova.compute.manager [None req-22536537-b16f-43a6-9dc4-cf2c2d6e35ed tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1284.419903] env[68617]: DEBUG oslo.service.loopingcall [None req-22536537-b16f-43a6-9dc4-cf2c2d6e35ed tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1284.420150] env[68617]: DEBUG nova.compute.manager [-] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1284.420249] env[68617]: DEBUG nova.network.neutron [-] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1284.445638] env[68617]: DEBUG nova.network.neutron [-] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1284.452197] env[68617]: DEBUG oslo_concurrency.lockutils [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1284.452492] env[68617]: DEBUG oslo_concurrency.lockutils [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1284.454007] env[68617]: INFO nova.compute.claims [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1284.457309] env[68617]: INFO nova.compute.manager [-] [instance: 71b1ebba-2019-4378-9bd2-98a7559c22e8] Took 0.04 seconds to deallocate network for instance. 
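
Throughout these records, oslo.concurrency's lockutils wraps each critical section in a named lock and logs how long the caller waited for it and held it: "compute_resources" serializes instance_claim/abort_instance_claim on the host, and the per-instance "71b1ebba-...-events" lock guards clear_events_for_instance during terminate. A simplified stand-in for that decorator/context-manager pattern is sketched below, assuming one process-local, non-reentrant threading.Lock per name (the real library additionally supports fair locks and inter-process file locks):

    import collections
    import threading
    import time
    from contextlib import contextmanager

    _locks = collections.defaultdict(threading.Lock)


    @contextmanager
    def named_lock(name, by):
        """Acquire the lock registered under `name`, logging wait/hold times."""
        lock = _locks[name]
        print('Acquiring lock "%s" by "%s"' % (name, by))
        t0 = time.monotonic()
        with lock:
            waited = time.monotonic() - t0
            print('Lock "%s" acquired by "%s" :: waited %.3fs' % (name, by, waited))
            t1 = time.monotonic()
            try:
                yield
            finally:
                held = time.monotonic() - t1
                print('Lock "%s" "released" by "%s" :: held %.3fs' % (name, by, held))


    # Usage mirroring the log: serialize claim accounting on one compute host.
    with named_lock("compute_resources", "ResourceTracker.instance_claim"):
        pass  # claim/abort bookkeeping would run here

This is why the 373.785s wait on the instance lock above is unremarkable: do_terminate_instance queued behind _locked_do_build_and_run_instance, which held the same named lock for the build's full 573.766s lifetime.
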
[ 1284.546055] env[68617]: DEBUG oslo_concurrency.lockutils [None req-22536537-b16f-43a6-9dc4-cf2c2d6e35ed tempest-FloatingIPsAssociationTestJSON-542062800 tempest-FloatingIPsAssociationTestJSON-542062800-project-member] Lock "71b1ebba-2019-4378-9bd2-98a7559c22e8" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.172s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1284.804727] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2048c8b-7ddc-4acd-b06e-3561a2ef2033 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1284.812992] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a7875dd-fb2b-4103-9736-32f9212c57a4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1284.843052] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6195e7cc-35c6-483f-b22d-3291038c71fc {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1284.850097] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7075b167-e08d-4a09-b55c-8131cf315f68 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1284.863288] env[68617]: DEBUG nova.compute.provider_tree [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1284.871882] env[68617]: DEBUG nova.scheduler.client.report [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1284.884865] env[68617]: DEBUG oslo_concurrency.lockutils [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.432s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1284.885346] env[68617]: DEBUG nova.compute.manager [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Start building networks asynchronously for instance. 
{{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1284.918288] env[68617]: DEBUG nova.compute.utils [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1284.919629] env[68617]: DEBUG nova.compute.manager [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1284.919808] env[68617]: DEBUG nova.network.neutron [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1284.929863] env[68617]: DEBUG nova.compute.manager [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1284.986623] env[68617]: DEBUG nova.policy [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b8fbe375e171403ea40986649eb7489a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a98906f3d98e49469a37662764665f78', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 1284.992976] env[68617]: DEBUG nova.compute.manager [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1285.018132] env[68617]: DEBUG nova.virt.hardware [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1285.018369] env[68617]: DEBUG nova.virt.hardware [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1285.018523] env[68617]: DEBUG nova.virt.hardware [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1285.018703] env[68617]: DEBUG nova.virt.hardware [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1285.018845] env[68617]: DEBUG nova.virt.hardware [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1285.018988] env[68617]: DEBUG nova.virt.hardware [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1285.019211] env[68617]: DEBUG nova.virt.hardware [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1285.019365] env[68617]: DEBUG nova.virt.hardware [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Build topologies for 1 vcpu(s) 1:1:1 
{{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1285.019702] env[68617]: DEBUG nova.virt.hardware [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1285.019702] env[68617]: DEBUG nova.virt.hardware [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1285.019847] env[68617]: DEBUG nova.virt.hardware [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1285.020846] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-690b8578-2435-43a3-874b-c9ea747b27a1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1285.028259] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87fc7ce1-5a3b-4198-a859-23454e4ee80a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1285.281881] env[68617]: DEBUG nova.network.neutron [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Successfully created port: ae748315-dd38-4c4e-a21e-0b9714f251dd {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1285.869856] env[68617]: DEBUG nova.network.neutron [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Successfully updated port: ae748315-dd38-4c4e-a21e-0b9714f251dd {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1285.884229] env[68617]: DEBUG oslo_concurrency.lockutils [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Acquiring lock "refresh_cache-5f31aef1-4806-48e1-9d5a-5dff09ea0f0d" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1285.884399] env[68617]: DEBUG oslo_concurrency.lockutils [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Acquired lock "refresh_cache-5f31aef1-4806-48e1-9d5a-5dff09ea0f0d" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1285.884527] env[68617]: DEBUG nova.network.neutron [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 
5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1285.920776] env[68617]: DEBUG nova.network.neutron [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1286.074400] env[68617]: DEBUG nova.network.neutron [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Updating instance_info_cache with network_info: [{"id": "ae748315-dd38-4c4e-a21e-0b9714f251dd", "address": "fa:16:3e:7c:f2:78", "network": {"id": "10b8a8fa-b1d0-4377-9319-bba7f6c17880", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1715835007-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a98906f3d98e49469a37662764665f78", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ca83c3bc-f3ec-42ab-85b3-192512f766f3", "external-id": "nsx-vlan-transportzone-879", "segmentation_id": 879, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapae748315-dd", "ovs_interfaceid": "ae748315-dd38-4c4e-a21e-0b9714f251dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1286.086245] env[68617]: DEBUG oslo_concurrency.lockutils [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Releasing lock "refresh_cache-5f31aef1-4806-48e1-9d5a-5dff09ea0f0d" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1286.086521] env[68617]: DEBUG nova.compute.manager [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Instance network_info: |[{"id": "ae748315-dd38-4c4e-a21e-0b9714f251dd", "address": "fa:16:3e:7c:f2:78", "network": {"id": "10b8a8fa-b1d0-4377-9319-bba7f6c17880", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1715835007-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a98906f3d98e49469a37662764665f78", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", 
"details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ca83c3bc-f3ec-42ab-85b3-192512f766f3", "external-id": "nsx-vlan-transportzone-879", "segmentation_id": 879, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapae748315-dd", "ovs_interfaceid": "ae748315-dd38-4c4e-a21e-0b9714f251dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1286.086900] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7c:f2:78', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ca83c3bc-f3ec-42ab-85b3-192512f766f3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ae748315-dd38-4c4e-a21e-0b9714f251dd', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1286.094836] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Creating folder: Project (a98906f3d98e49469a37662764665f78). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1286.095355] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c9bdacdc-4b3a-4d3e-a74a-f9f95670146c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1286.105657] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Created folder: Project (a98906f3d98e49469a37662764665f78) in parent group-v693691. [ 1286.105842] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Creating folder: Instances. Parent ref: group-v693758. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1286.106086] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c91f7964-4ead-40c7-ae06-fc7c66c93d83 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1286.115298] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Created folder: Instances in parent group-v693758. [ 1286.115510] env[68617]: DEBUG oslo.service.loopingcall [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1286.115681] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1286.115859] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1cb5c977-409c-4e86-8857-d5789989fa97 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1286.134740] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1286.134740] env[68617]: value = "task-3470804" [ 1286.134740] env[68617]: _type = "Task" [ 1286.134740] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1286.143739] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470804, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1286.302977] env[68617]: DEBUG nova.compute.manager [req-ba6c68ba-f557-404a-841b-9e442723670e req-e8ae6e63-7d55-4ff0-84c7-4a5fc99512c0 service nova] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Received event network-vif-plugged-ae748315-dd38-4c4e-a21e-0b9714f251dd {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1286.303224] env[68617]: DEBUG oslo_concurrency.lockutils [req-ba6c68ba-f557-404a-841b-9e442723670e req-e8ae6e63-7d55-4ff0-84c7-4a5fc99512c0 service nova] Acquiring lock "5f31aef1-4806-48e1-9d5a-5dff09ea0f0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1286.303434] env[68617]: DEBUG oslo_concurrency.lockutils [req-ba6c68ba-f557-404a-841b-9e442723670e req-e8ae6e63-7d55-4ff0-84c7-4a5fc99512c0 service nova] Lock "5f31aef1-4806-48e1-9d5a-5dff09ea0f0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1286.303602] env[68617]: DEBUG oslo_concurrency.lockutils [req-ba6c68ba-f557-404a-841b-9e442723670e req-e8ae6e63-7d55-4ff0-84c7-4a5fc99512c0 service nova] Lock "5f31aef1-4806-48e1-9d5a-5dff09ea0f0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1286.303778] env[68617]: DEBUG nova.compute.manager [req-ba6c68ba-f557-404a-841b-9e442723670e req-e8ae6e63-7d55-4ff0-84c7-4a5fc99512c0 service nova] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] No waiting events found dispatching network-vif-plugged-ae748315-dd38-4c4e-a21e-0b9714f251dd {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1286.303962] env[68617]: WARNING nova.compute.manager [req-ba6c68ba-f557-404a-841b-9e442723670e req-e8ae6e63-7d55-4ff0-84c7-4a5fc99512c0 service nova] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Received unexpected event network-vif-plugged-ae748315-dd38-4c4e-a21e-0b9714f251dd for instance with vm_state building and task_state spawning. 
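Task "task-3470804" above follows the usual oslo.vmware flow: invoking a vSphere task method returns a task moref rather than a result, which is then polled to completion (the "_poll_task ... progress is 0%" line is that loop). A hedged sketch of the flow using the public oslo.vmware session API, assuming an established oslo_vmware.api.VMwareAPISession; the helper and its argument names are placeholders, and Nova's own vm_util uses the session's private call wrapper rather than invoke_api:

    def create_vm(session, folder_ref, config_spec, res_pool_ref):
        # Invoke the vSphere method; this returns a task moref such as
        # "task-3470804", not the created VM.
        task = session.invoke_api(session.vim, 'CreateVM_Task', folder_ref,
                                  config=config_spec, pool=res_pool_ref)
        # wait_for_task polls the task server-side state (producing the
        # "progress is N%" log lines) and returns the task info on success
        # or raises on task error.
        return session.wait_for_task(task)

The "Received unexpected event network-vif-plugged-..." warning immediately above is benign in this sequence: Neutron delivered the vif-plugged event while the instance was still in task_state spawning, before the driver had registered a waiter for it.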
[ 1286.304139] env[68617]: DEBUG nova.compute.manager [req-ba6c68ba-f557-404a-841b-9e442723670e req-e8ae6e63-7d55-4ff0-84c7-4a5fc99512c0 service nova] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Received event network-changed-ae748315-dd38-4c4e-a21e-0b9714f251dd {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1286.304299] env[68617]: DEBUG nova.compute.manager [req-ba6c68ba-f557-404a-841b-9e442723670e req-e8ae6e63-7d55-4ff0-84c7-4a5fc99512c0 service nova] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Refreshing instance network info cache due to event network-changed-ae748315-dd38-4c4e-a21e-0b9714f251dd. {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1286.304478] env[68617]: DEBUG oslo_concurrency.lockutils [req-ba6c68ba-f557-404a-841b-9e442723670e req-e8ae6e63-7d55-4ff0-84c7-4a5fc99512c0 service nova] Acquiring lock "refresh_cache-5f31aef1-4806-48e1-9d5a-5dff09ea0f0d" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1286.304612] env[68617]: DEBUG oslo_concurrency.lockutils [req-ba6c68ba-f557-404a-841b-9e442723670e req-e8ae6e63-7d55-4ff0-84c7-4a5fc99512c0 service nova] Acquired lock "refresh_cache-5f31aef1-4806-48e1-9d5a-5dff09ea0f0d" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1286.304826] env[68617]: DEBUG nova.network.neutron [req-ba6c68ba-f557-404a-841b-9e442723670e req-e8ae6e63-7d55-4ff0-84c7-4a5fc99512c0 service nova] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Refreshing network info cache for port ae748315-dd38-4c4e-a21e-0b9714f251dd {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1286.644818] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470804, 'name': CreateVM_Task, 'duration_secs': 0.311441} completed successfully. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1286.645012] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1286.645711] env[68617]: DEBUG oslo_concurrency.lockutils [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1286.645877] env[68617]: DEBUG oslo_concurrency.lockutils [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1286.646216] env[68617]: DEBUG oslo_concurrency.lockutils [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1286.646471] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-092a680f-8dfd-4cbc-9efb-da96939a8fff {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1286.651019] env[68617]: DEBUG oslo_vmware.api [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Waiting for the task: (returnval){ [ 1286.651019] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52c54c6f-d35a-b991-4c14-9d5bb82ce57f" [ 1286.651019] env[68617]: _type = "Task" [ 1286.651019] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1286.659591] env[68617]: DEBUG oslo_vmware.api [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52c54c6f-d35a-b991-4c14-9d5bb82ce57f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1286.748841] env[68617]: DEBUG nova.network.neutron [req-ba6c68ba-f557-404a-841b-9e442723670e req-e8ae6e63-7d55-4ff0-84c7-4a5fc99512c0 service nova] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Updated VIF entry in instance network info cache for port ae748315-dd38-4c4e-a21e-0b9714f251dd. 
{{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1286.749235] env[68617]: DEBUG nova.network.neutron [req-ba6c68ba-f557-404a-841b-9e442723670e req-e8ae6e63-7d55-4ff0-84c7-4a5fc99512c0 service nova] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Updating instance_info_cache with network_info: [{"id": "ae748315-dd38-4c4e-a21e-0b9714f251dd", "address": "fa:16:3e:7c:f2:78", "network": {"id": "10b8a8fa-b1d0-4377-9319-bba7f6c17880", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1715835007-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a98906f3d98e49469a37662764665f78", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ca83c3bc-f3ec-42ab-85b3-192512f766f3", "external-id": "nsx-vlan-transportzone-879", "segmentation_id": 879, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapae748315-dd", "ovs_interfaceid": "ae748315-dd38-4c4e-a21e-0b9714f251dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1286.760426] env[68617]: DEBUG oslo_concurrency.lockutils [req-ba6c68ba-f557-404a-841b-9e442723670e req-e8ae6e63-7d55-4ff0-84c7-4a5fc99512c0 service nova] Releasing lock "refresh_cache-5f31aef1-4806-48e1-9d5a-5dff09ea0f0d" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1287.161068] env[68617]: DEBUG oslo_concurrency.lockutils [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1287.161414] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1287.161538] env[68617]: DEBUG oslo_concurrency.lockutils [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1289.496783] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._sync_power_states {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 
1289.519303] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Getting list of instances from cluster (obj){ [ 1289.519303] env[68617]: value = "domain-c8" [ 1289.519303] env[68617]: _type = "ClusterComputeResource" [ 1289.519303] env[68617]: } {{(pid=68617) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1289.520689] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5381478-0627-4fe7-b23c-13bd60af793f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1289.538308] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Got total of 10 instances {{(pid=68617) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1289.538477] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Triggering sync for uuid e6b6cbdd-11d6-44a6-8da7-98e0f52cef67 {{(pid=68617) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1289.538653] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Triggering sync for uuid b27ace75-e2fa-4acc-96cb-88dd49b89de5 {{(pid=68617) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1289.538807] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Triggering sync for uuid 995585f5-57a4-4ba6-9e28-18a086af264c {{(pid=68617) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1289.538957] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Triggering sync for uuid 82864ac3-a199-478c-8c57-97ea0a256201 {{(pid=68617) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1289.539122] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Triggering sync for uuid dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908 {{(pid=68617) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1289.539272] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Triggering sync for uuid 79c92a1b-20ef-4360-93b4-913cbfcf92fe {{(pid=68617) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1289.539424] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Triggering sync for uuid 1cc42c7f-8781-40b0-9f75-edfef3bc90e7 {{(pid=68617) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1289.539581] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Triggering sync for uuid d46ca6f3-0ee9-412c-98b4-f639ce4f9228 {{(pid=68617) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1289.539736] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Triggering sync for uuid a8ff6232-530c-453a-96e4-f8ce00f976e3 {{(pid=68617) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1289.539884] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Triggering sync for uuid 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d {{(pid=68617) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1289.540195] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring 
lock "e6b6cbdd-11d6-44a6-8da7-98e0f52cef67" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1289.540420] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "b27ace75-e2fa-4acc-96cb-88dd49b89de5" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1289.540616] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "995585f5-57a4-4ba6-9e28-18a086af264c" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1289.540872] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "82864ac3-a199-478c-8c57-97ea0a256201" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1289.541078] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1289.541271] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "79c92a1b-20ef-4360-93b4-913cbfcf92fe" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1289.541457] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "1cc42c7f-8781-40b0-9f75-edfef3bc90e7" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1289.541685] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "d46ca6f3-0ee9-412c-98b4-f639ce4f9228" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1289.541879] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "a8ff6232-530c-453a-96e4-f8ce00f976e3" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1289.542084] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "5f31aef1-4806-48e1-9d5a-5dff09ea0f0d" by 
"nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1293.963532] env[68617]: DEBUG oslo_concurrency.lockutils [None req-b1ed08e4-6131-4183-bd2d-81d6cbb25b53 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquiring lock "a8ff6232-530c-453a-96e4-f8ce00f976e3" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1294.025680] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9d26d2ca-c76b-4b35-8148-3e3abf9c0836 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquiring lock "d46ca6f3-0ee9-412c-98b4-f639ce4f9228" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1294.089612] env[68617]: DEBUG oslo_concurrency.lockutils [None req-80ebfc67-2231-4d45-83cd-8491204e5755 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquiring lock "1cc42c7f-8781-40b0-9f75-edfef3bc90e7" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1299.699207] env[68617]: DEBUG oslo_concurrency.lockutils [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Acquiring lock "ee6efd93-25be-4268-afe9-ba39e543a4fb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1299.699724] env[68617]: DEBUG oslo_concurrency.lockutils [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Lock "ee6efd93-25be-4268-afe9-ba39e543a4fb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1301.559249] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ea1854ca-db59-4b97-97e1-e32424d8ed6a tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Acquiring lock "5f31aef1-4806-48e1-9d5a-5dff09ea0f0d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1302.216238] env[68617]: DEBUG oslo_concurrency.lockutils [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "1605028f-5d6d-4ac4-8416-c0465982c53a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1302.216488] env[68617]: DEBUG oslo_concurrency.lockutils [None 
req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "1605028f-5d6d-4ac4-8416-c0465982c53a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1306.744419] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1306.744717] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1306.744756] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1306.767146] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1306.767309] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1306.767439] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1306.767564] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1306.767684] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1306.767805] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1306.767922] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Skipping network cache update for instance because it is Building. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1306.768056] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1306.768179] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1306.768295] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1306.768412] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1308.698610] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1308.698882] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1310.699371] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1310.699670] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1311.699570] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1311.699850] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1311.700214] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1314.697672] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1314.697922] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1314.710386] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1314.710634] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1314.710803] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1314.710959] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1314.712137] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81e1ddf7-375e-42ee-8d76-427e5a722be3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1314.721524] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71411314-a52a-49dd-b8f6-80e712b5cd5d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1314.736317] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-456f648f-bb2a-44ec-9081-ced7914af66a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1314.743241] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-141c965b-4b5b-40c2-af7d-3c9ec2176d18 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1314.773034] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180936MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1314.773211] env[68617]: DEBUG 
oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1314.773421] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1314.858408] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e6b6cbdd-11d6-44a6-8da7-98e0f52cef67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1314.858540] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b27ace75-e2fa-4acc-96cb-88dd49b89de5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1314.858707] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 995585f5-57a4-4ba6-9e28-18a086af264c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1314.858791] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 82864ac3-a199-478c-8c57-97ea0a256201 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1314.858916] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1314.859049] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 79c92a1b-20ef-4360-93b4-913cbfcf92fe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1314.859170] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 1cc42c7f-8781-40b0-9f75-edfef3bc90e7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1314.859286] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance d46ca6f3-0ee9-412c-98b4-f639ce4f9228 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1314.859610] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a8ff6232-530c-453a-96e4-f8ce00f976e3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1314.859810] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1314.872544] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 59df690b-bfbb-4976-b80b-60106c53ba25 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1314.884409] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 98b47fc9-678d-4c60-b9e5-78423719ae76 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1314.895645] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e90877a8-47d3-47d7-8362-5bcfe3a98c36 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1314.907346] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance c5764a1d-3370-4756-ada0-03b503368d17 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1314.918430] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance c0528a20-34cb-4b51-bb4c-8c3828021a85 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1314.930091] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance fa9b2716-783b-4b19-bfc9-aad609c3a659 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1314.941191] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance dd611e75-aac1-4cdb-b263-6956d6254743 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1314.953503] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 075eb6cb-a53b-44d9-986d-bc85d4b8ac25 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1314.965499] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 65014c6f-8b4e-4468-9462-4b8cdc08af73 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1314.978818] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 7e1c7e8a-139e-4e8a-a3e1-39a2d7c3fc47 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1314.990878] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 2bffd2c4-f290-4df6-b7b6-6dd963befdab has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1315.003650] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 13d6e00b-3c18-4346-b229-b56bdaba2dc8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1315.015146] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f03b9bc5-9438-4c0c-b595-72c631bece08 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1315.027568] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 570302ee-2383-4659-80e1-af4b16d03a21 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1315.039363] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 7bf75617-fcd8-4d96-bf02-ddb723e8ad96 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1315.050312] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance ee6efd93-25be-4268-afe9-ba39e543a4fb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1315.060660] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 1605028f-5d6d-4ac4-8416-c0465982c53a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
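The stanza above shows the resource tracker's periodic audit sorting placement allocations into three buckets: allocations for instances actively managed on this host are kept, allocations for instances merely scheduled here but not yet started are left alone ("Skipping heal of allocation"), and allocations for deleted instances are removed. A minimal sketch of that three-way decision, with hypothetical state names; this is not nova's resource_tracker code:

```python
# Simplified sketch of the allocation-audit decision visible in the log
# above. NOT nova's resource_tracker implementation; states are illustrative.
SCHEDULED = "scheduled"   # placement allocation exists, instance not started
ACTIVE = "active"         # instance actively managed on this host
DELETED = "deleted"       # instance gone; allocation would be stale

def heal_action(instance_state: str) -> str:
    """Mirror the three outcomes the periodic task logs."""
    if instance_state == DELETED:
        return "remove allocation"   # stale allocation gets deleted
    if instance_state == SCHEDULED:
        return "skip heal"           # scheduler owns it until the build starts
    return "keep allocation"         # actively managed, allocation is correct

if __name__ == "__main__":
    alloc = {"resources": {"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1}}
    for state in (ACTIVE, SCHEDULED, DELETED):
        print(state, "->", heal_action(state), alloc)
```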
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1315.060911] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1315.061075] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1856MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1315.399935] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93e45886-5483-46c3-bde2-67ff1c5eaca3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1315.407842] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91e386bd-b97c-4535-a22e-ab5a1020c9aa {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1315.439347] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26d1dd0b-9af2-4d35-aebc-7df84be092e9 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1315.446493] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b9e67e6-b1cc-4b9f-8611-9e02aa3c5cd6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1315.460052] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1315.469600] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1315.486136] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1315.486330] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.713s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1325.845069] env[68617]: DEBUG oslo_concurrency.lockutils [None 
req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Acquiring lock "fc1043b8-535d-4af0-b92b-1f43580cdc9a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1325.845370] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Lock "fc1043b8-535d-4af0-b92b-1f43580cdc9a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1332.346056] env[68617]: WARNING oslo_vmware.rw_handles [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1332.346056] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1332.346056] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1332.346056] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1332.346056] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1332.346056] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 1332.346056] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1332.346056] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1332.346056] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1332.346056] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1332.346056] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1332.346056] env[68617]: ERROR oslo_vmware.rw_handles [ 1332.346957] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/1965b498-1581-4e7f-b211-87ec98203c06/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1332.348464] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1332.348720] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Copying Virtual Disk [datastore2]
vmware_temp/1965b498-1581-4e7f-b211-87ec98203c06/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/1965b498-1581-4e7f-b211-87ec98203c06/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1332.349010] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-38251fe8-c42c-4d8b-943d-5db4121a32ee {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1332.357938] env[68617]: DEBUG oslo_vmware.api [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Waiting for the task: (returnval){ [ 1332.357938] env[68617]: value = "task-3470805" [ 1332.357938] env[68617]: _type = "Task" [ 1332.357938] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1332.365601] env[68617]: DEBUG oslo_vmware.api [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Task: {'id': task-3470805, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1332.868734] env[68617]: DEBUG oslo_vmware.exceptions [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Fault InvalidArgument not matched. {{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1332.869983] env[68617]: DEBUG oslo_concurrency.lockutils [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1332.869983] env[68617]: ERROR nova.compute.manager [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1332.869983] env[68617]: Faults: ['InvalidArgument'] [ 1332.869983] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Traceback (most recent call last): [ 1332.869983] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1332.869983] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] yield resources [ 1332.869983] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1332.869983] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] self.driver.spawn(context, instance, image_meta, [ 1332.869983] env[68617]: ERROR nova.compute.manager [instance: 
e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1332.869983] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1332.870456] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1332.870456] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] self._fetch_image_if_missing(context, vi) [ 1332.870456] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1332.870456] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] image_cache(vi, tmp_image_ds_loc) [ 1332.870456] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1332.870456] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] vm_util.copy_virtual_disk( [ 1332.870456] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1332.870456] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] session._wait_for_task(vmdk_copy_task) [ 1332.870456] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1332.870456] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] return self.wait_for_task(task_ref) [ 1332.870456] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1332.870456] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] return evt.wait() [ 1332.870456] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1332.870954] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] result = hub.switch() [ 1332.870954] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1332.870954] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] return self.greenlet.switch() [ 1332.870954] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1332.870954] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] self.f(*self.args, **self.kw) [ 1332.870954] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1332.870954] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] raise 
exceptions.translate_fault(task_info.error) [ 1332.870954] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1332.870954] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Faults: ['InvalidArgument'] [ 1332.870954] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] [ 1332.870954] env[68617]: INFO nova.compute.manager [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Terminating instance [ 1332.871718] env[68617]: DEBUG oslo_concurrency.lockutils [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1332.872682] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1332.873337] env[68617]: DEBUG nova.compute.manager [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Start destroying the instance on the hypervisor. 
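The interleaved requests above show the image-cache pattern at work: one build has just failed after downloading the image, and another request immediately acquires the per-image datastore lock on the cached VMDK path so it can re-fetch the image while everyone else serializes behind it. A stdlib-only sketch of that fetch-if-missing idiom, with hypothetical names and an atomic rename standing in for the datastore copy; this is not nova.virt.vmwareapi code:

```python
# Fetch-if-missing under a per-image lock: download at most once, then let
# all callers reuse the cached copy. Illustrative only.
import os
import tempfile
import threading

_image_locks: dict[str, threading.Lock] = {}
_locks_guard = threading.Lock()

def _lock_for(image_id: str) -> threading.Lock:
    with _locks_guard:
        return _image_locks.setdefault(image_id, threading.Lock())

def fetch_image_if_missing(image_id: str, cache_dir: str, download) -> str:
    """Return the cached image path, downloading it at most once."""
    cached = os.path.join(cache_dir, image_id)
    with _lock_for(image_id):            # like the devstack-image-cache lock
        if not os.path.exists(cached):
            fd, tmp = tempfile.mkstemp(dir=cache_dir)  # tmp-sparse analogue
            with os.fdopen(fd, "wb") as f:
                f.write(download(image_id))
            os.rename(tmp, cached)       # publish the cache entry atomically
    return cached

if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as d:
        p = fetch_image_if_missing("c87eab51", d, lambda i: b"fake image bytes")
        print(p, os.path.getsize(p))
```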
{{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1332.873526] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1332.873756] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-41a815e1-f9b7-44ee-9e98-9f5c1a767e5d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1332.876064] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80e7a760-ab82-4685-aee9-56dd5698321b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1332.883484] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1332.884485] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-80380ff3-fa14-45b9-a050-57931ba8d7a1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1332.885891] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1332.886075] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1332.886718] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3c378cbf-11ed-4d8b-96d0-3f66c9fc91e1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1332.891456] env[68617]: DEBUG oslo_vmware.api [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Waiting for the task: (returnval){ [ 1332.891456] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52dbf949-60c2-84cc-fc49-06e492a5d4e7" [ 1332.891456] env[68617]: _type = "Task" [ 1332.891456] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1332.905448] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1332.905667] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Creating directory with path [datastore2] vmware_temp/e330c5f5-a78b-4890-8b9f-ce0c40f17fdf/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1332.905859] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bbccde3c-7e32-4af5-9675-da1182da9ed5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1332.925520] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Created directory with path [datastore2] vmware_temp/e330c5f5-a78b-4890-8b9f-ce0c40f17fdf/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1332.925647] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Fetch image to [datastore2] vmware_temp/e330c5f5-a78b-4890-8b9f-ce0c40f17fdf/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1332.925811] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/e330c5f5-a78b-4890-8b9f-ce0c40f17fdf/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1332.926545] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b8f2ca7-70fc-4014-bb15-e133c48dfa36 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1332.933230] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05d5ebcb-f733-4760-b75f-8b3f08e2c028 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1332.941811] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dae3f619-cd58-4b7d-9d1b-46679be92e74 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1332.975277] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-b5449f3c-b2cb-4b01-ad20-628b79aaee64 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1332.977854] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1332.978056] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1332.978230] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Deleting the datastore file [datastore2] e6b6cbdd-11d6-44a6-8da7-98e0f52cef67 {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1332.978463] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-302f65ce-1f11-4bb7-8d7f-54f46c668efc {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1332.983256] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0988d72c-7b0d-4841-91e7-09d8b5f3f4f2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1332.986024] env[68617]: DEBUG oslo_vmware.api [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Waiting for the task: (returnval){ [ 1332.986024] env[68617]: value = "task-3470807" [ 1332.986024] env[68617]: _type = "Task" [ 1332.986024] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1332.993563] env[68617]: DEBUG oslo_vmware.api [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Task: {'id': task-3470807, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1333.012932] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1333.208208] env[68617]: DEBUG oslo_vmware.rw_handles [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e330c5f5-a78b-4890-8b9f-ce0c40f17fdf/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1333.269353] env[68617]: DEBUG oslo_vmware.rw_handles [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1333.269549] env[68617]: DEBUG oslo_vmware.rw_handles [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e330c5f5-a78b-4890-8b9f-ce0c40f17fdf/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1333.496235] env[68617]: DEBUG oslo_vmware.api [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Task: {'id': task-3470807, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066698} completed successfully. 
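The write handle above streams the 21318656-byte image to the datastore's HTTPS /folder endpoint and only learns the outcome when it closes the handle and reads the response, which is exactly the point where the earlier http.client.RemoteDisconnected surfaced: the ESX host had already closed the socket before answering. A hedged stdlib sketch of such a streaming upload; host, path, and ticket handling are placeholders, and this is not oslo.vmware's rw_handles implementation:

```python
# Stream bytes to an HTTPS endpoint in chunks and read the response on
# close. If the server drops the connection before responding,
# getresponse() raises http.client.RemoteDisconnected, as seen in this log.
import http.client
import io

def upload_to_datastore(host: str, path: str, data: io.BufferedIOBase,
                        size: int, ticket: str) -> int:
    conn = http.client.HTTPSConnection(host)
    conn.putrequest("PUT", path)
    conn.putheader("Content-Length", str(size))
    conn.putheader("Cookie", ticket)      # e.g. a generic service ticket
    conn.endheaders()
    while chunk := data.read(64 * 1024):  # stream instead of buffering 21 MB
        conn.send(chunk)
    resp = conn.getresponse()             # may raise RemoteDisconnected here
    conn.close()
    return resp.status
```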
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1333.496547] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1333.496676] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1333.496846] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1333.497027] env[68617]: INFO nova.compute.manager [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Took 0.62 seconds to destroy the instance on the hypervisor. [ 1333.499322] env[68617]: DEBUG nova.compute.claims [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1333.499532] env[68617]: DEBUG oslo_concurrency.lockutils [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1333.499749] env[68617]: DEBUG oslo_concurrency.lockutils [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1333.864787] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d173f8c-75f9-4ded-8449-ccef0b4d6c56 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1333.872605] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf27718f-e786-4f7f-87e3-5a0967757ec9 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1333.902331] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36580521-fff0-4f8f-82c7-54a93c945ec9 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1333.909235] env[68617]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcdc4d94-61fa-4276-9d3e-a60a783e74cc {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1333.923041] env[68617]: DEBUG nova.compute.provider_tree [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1333.933117] env[68617]: DEBUG nova.scheduler.client.report [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1333.949506] env[68617]: DEBUG oslo_concurrency.lockutils [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.449s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1333.949694] env[68617]: ERROR nova.compute.manager [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1333.949694] env[68617]: Faults: ['InvalidArgument'] [ 1333.949694] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Traceback (most recent call last): [ 1333.949694] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1333.949694] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] self.driver.spawn(context, instance, image_meta, [ 1333.949694] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1333.949694] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1333.949694] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1333.949694] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] self._fetch_image_if_missing(context, vi) [ 1333.949694] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in 
_fetch_image_if_missing [ 1333.949694] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] image_cache(vi, tmp_image_ds_loc) [ 1333.949694] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1333.950080] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] vm_util.copy_virtual_disk( [ 1333.950080] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1333.950080] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] session._wait_for_task(vmdk_copy_task) [ 1333.950080] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1333.950080] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] return self.wait_for_task(task_ref) [ 1333.950080] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1333.950080] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] return evt.wait() [ 1333.950080] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1333.950080] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] result = hub.switch() [ 1333.950080] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1333.950080] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] return self.greenlet.switch() [ 1333.950080] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1333.950080] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] self.f(*self.args, **self.kw) [ 1333.950474] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1333.950474] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] raise exceptions.translate_fault(task_info.error) [ 1333.950474] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1333.950474] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Faults: ['InvalidArgument'] [ 1333.950474] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] [ 1333.950474] env[68617]: DEBUG nova.compute.utils [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] VimFaultException {{(pid=68617) 
notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1333.952041] env[68617]: DEBUG nova.compute.manager [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Build of instance e6b6cbdd-11d6-44a6-8da7-98e0f52cef67 was re-scheduled: A specified parameter was not correct: fileType [ 1333.952041] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1333.952222] env[68617]: DEBUG nova.compute.manager [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1333.952362] env[68617]: DEBUG nova.compute.manager [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1333.952516] env[68617]: DEBUG nova.compute.manager [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1333.952673] env[68617]: DEBUG nova.network.neutron [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1334.119376] env[68617]: DEBUG neutronclient.v2_0.client [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68617) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1334.121500] env[68617]: ERROR nova.compute.manager [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1334.121500] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Traceback (most recent call last): [ 1334.121500] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1334.121500] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] self.driver.spawn(context, instance, image_meta, [ 1334.121500] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1334.121500] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1334.121500] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1334.121500] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] self._fetch_image_if_missing(context, vi) [ 1334.121500] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1334.121500] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] image_cache(vi, tmp_image_ds_loc) [ 1334.121500] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1334.121500] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] vm_util.copy_virtual_disk( [ 1334.121965] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1334.121965] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] session._wait_for_task(vmdk_copy_task) [ 1334.121965] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1334.121965] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] return self.wait_for_task(task_ref) [ 1334.121965] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1334.121965] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] return evt.wait() [ 1334.121965] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1334.121965] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] result = hub.switch() [ 1334.121965] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1334.121965] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] return self.greenlet.switch() [ 1334.121965] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1334.121965] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] self.f(*self.args, **self.kw) [ 1334.121965] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1334.122440] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] raise exceptions.translate_fault(task_info.error) [ 1334.122440] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1334.122440] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Faults: ['InvalidArgument'] [ 1334.122440] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] [ 1334.122440] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] During handling of the above exception, another exception occurred: [ 1334.122440] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] [ 1334.122440] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Traceback (most recent call last): [ 1334.122440] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/compute/manager.py", line 2430, in _do_build_and_run_instance [ 1334.122440] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] self._build_and_run_instance(context, instance, image, [ 1334.122440] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/compute/manager.py", line 2722, in _build_and_run_instance [ 1334.122440] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] raise exception.RescheduledException( [ 1334.122440] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] nova.exception.RescheduledException: Build of instance e6b6cbdd-11d6-44a6-8da7-98e0f52cef67 was re-scheduled: A specified parameter was not correct: fileType [ 1334.122440] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Faults: ['InvalidArgument'] [ 1334.122440] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] [ 1334.122910] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] During handling of the above exception, another exception occurred: [ 1334.122910] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] [ 1334.122910] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Traceback (most recent call last): [ 1334.122910] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1334.122910] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] ret = obj(*args, **kwargs) [ 1334.122910] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 
1334.122910] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] exception_handler_v20(status_code, error_body) [ 1334.122910] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1334.122910] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] raise client_exc(message=error_message, [ 1334.122910] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1334.122910] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Neutron server returns request_ids: ['req-578c9562-3fac-42c4-b14f-adfcdc3fefe8'] [ 1334.122910] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] [ 1334.122910] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] During handling of the above exception, another exception occurred: [ 1334.123432] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] [ 1334.123432] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Traceback (most recent call last): [ 1334.123432] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/compute/manager.py", line 3019, in _cleanup_allocated_networks [ 1334.123432] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] self._deallocate_network(context, instance, requested_networks) [ 1334.123432] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1334.123432] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] self.network_api.deallocate_for_instance( [ 1334.123432] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1334.123432] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] data = neutron.list_ports(**search_opts) [ 1334.123432] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1334.123432] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] ret = obj(*args, **kwargs) [ 1334.123432] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1334.123432] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] return self.list('ports', self.ports_path, retrieve_all, [ 1334.123432] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1334.123862] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] ret = obj(*args, **kwargs) [ 1334.123862] env[68617]: ERROR nova.compute.manager [instance: 
e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1334.123862] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] for r in self._pagination(collection, path, **params): [ 1334.123862] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1334.123862] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] res = self.get(path, params=params) [ 1334.123862] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1334.123862] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] ret = obj(*args, **kwargs) [ 1334.123862] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1334.123862] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] return self.retry_request("GET", action, body=body, [ 1334.123862] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1334.123862] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] ret = obj(*args, **kwargs) [ 1334.123862] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1334.123862] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] return self.do_request(method, action, body=body, [ 1334.124704] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1334.124704] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] ret = obj(*args, **kwargs) [ 1334.124704] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1334.124704] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] self._handle_fault_response(status_code, replybody, resp) [ 1334.124704] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1334.124704] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] raise exception.Unauthorized() [ 1334.124704] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] nova.exception.Unauthorized: Not authorized. 
[ 1334.124704] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] [ 1334.189368] env[68617]: INFO nova.scheduler.client.report [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Deleted allocations for instance e6b6cbdd-11d6-44a6-8da7-98e0f52cef67 [ 1334.214608] env[68617]: DEBUG oslo_concurrency.lockutils [None req-62dd095b-729b-4cfc-bc66-2c61aef3aba9 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Lock "e6b6cbdd-11d6-44a6-8da7-98e0f52cef67" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 622.298s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1334.215722] env[68617]: DEBUG oslo_concurrency.lockutils [None req-02678a3c-83cc-4b14-9b23-d90f9e93cbd7 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Lock "e6b6cbdd-11d6-44a6-8da7-98e0f52cef67" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 424.383s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1334.215933] env[68617]: DEBUG oslo_concurrency.lockutils [None req-02678a3c-83cc-4b14-9b23-d90f9e93cbd7 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Acquiring lock "e6b6cbdd-11d6-44a6-8da7-98e0f52cef67-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1334.216151] env[68617]: DEBUG oslo_concurrency.lockutils [None req-02678a3c-83cc-4b14-9b23-d90f9e93cbd7 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Lock "e6b6cbdd-11d6-44a6-8da7-98e0f52cef67-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1334.216319] env[68617]: DEBUG oslo_concurrency.lockutils [None req-02678a3c-83cc-4b14-9b23-d90f9e93cbd7 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Lock "e6b6cbdd-11d6-44a6-8da7-98e0f52cef67-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1334.219955] env[68617]: INFO nova.compute.manager [None req-02678a3c-83cc-4b14-9b23-d90f9e93cbd7 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Terminating instance [ 1334.221434] env[68617]: DEBUG nova.compute.manager [None req-02678a3c-83cc-4b14-9b23-d90f9e93cbd7 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Start destroying the instance on the hypervisor. 
{{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1334.221660] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-02678a3c-83cc-4b14-9b23-d90f9e93cbd7 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1334.221911] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-47695a18-3c64-4a23-9d4f-9a7e00b30889 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1334.231253] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28f849b3-6037-43d0-baa5-f9cf1c707af3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1334.246017] env[68617]: DEBUG nova.compute.manager [None req-0e1b8687-2cf3-4567-937f-3f76cec5553d tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 030eceb1-51a5-4e34-ad67-727b7ebd524f] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1334.267307] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-02678a3c-83cc-4b14-9b23-d90f9e93cbd7 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e6b6cbdd-11d6-44a6-8da7-98e0f52cef67 could not be found. [ 1334.267307] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-02678a3c-83cc-4b14-9b23-d90f9e93cbd7 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1334.267307] env[68617]: INFO nova.compute.manager [None req-02678a3c-83cc-4b14-9b23-d90f9e93cbd7 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1334.267307] env[68617]: DEBUG oslo.service.loopingcall [None req-02678a3c-83cc-4b14-9b23-d90f9e93cbd7 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1334.267464] env[68617]: DEBUG nova.compute.manager [-] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1334.267545] env[68617]: DEBUG nova.network.neutron [-] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1334.275982] env[68617]: DEBUG nova.compute.manager [None req-0e1b8687-2cf3-4567-937f-3f76cec5553d tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 030eceb1-51a5-4e34-ad67-727b7ebd524f] Instance disappeared before build. 
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1334.310352] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0e1b8687-2cf3-4567-937f-3f76cec5553d tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "030eceb1-51a5-4e34-ad67-727b7ebd524f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 228.557s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1334.334075] env[68617]: DEBUG nova.compute.manager [None req-ba5b7b45-1fa5-4ef2-8ea9-dc1c7e6ef22e tempest-FloatingIPsAssociationNegativeTestJSON-296212251 tempest-FloatingIPsAssociationNegativeTestJSON-296212251-project-member] [instance: 07927d19-2354-4215-b89d-5920e20b8222] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1334.358754] env[68617]: DEBUG nova.compute.manager [None req-ba5b7b45-1fa5-4ef2-8ea9-dc1c7e6ef22e tempest-FloatingIPsAssociationNegativeTestJSON-296212251 tempest-FloatingIPsAssociationNegativeTestJSON-296212251-project-member] [instance: 07927d19-2354-4215-b89d-5920e20b8222] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1334.383939] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ba5b7b45-1fa5-4ef2-8ea9-dc1c7e6ef22e tempest-FloatingIPsAssociationNegativeTestJSON-296212251 tempest-FloatingIPsAssociationNegativeTestJSON-296212251-project-member] Lock "07927d19-2354-4215-b89d-5920e20b8222" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 219.648s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1334.395877] env[68617]: DEBUG nova.compute.manager [None req-513971dc-01da-410d-ae01-53e625bf6a3c tempest-AttachInterfacesV270Test-135274226 tempest-AttachInterfacesV270Test-135274226-project-member] [instance: 59df690b-bfbb-4976-b80b-60106c53ba25] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1334.424499] env[68617]: DEBUG nova.compute.manager [None req-513971dc-01da-410d-ae01-53e625bf6a3c tempest-AttachInterfacesV270Test-135274226 tempest-AttachInterfacesV270Test-135274226-project-member] [instance: 59df690b-bfbb-4976-b80b-60106c53ba25] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1334.443070] env[68617]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68617) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1334.443070] env[68617]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1334.448052] env[68617]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
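The two errors above name the likely root cause: the service credentials in the [neutron] section of nova.conf can no longer obtain a Keystone token. A quick way to confirm is to replay that authentication outside Nova with keystoneauth1; every value below is a placeholder to be replaced with the settings from nova.conf (the URL and account names are not taken from this log):

from keystoneauth1 import session
from keystoneauth1.identity import v3

# Placeholders: copy auth_url, username, password, project_name and the
# domain options from the [neutron] section of nova.conf.
auth = v3.Password(
    auth_url='http://controller.example.test/identity/v3',
    username='nova',
    password='REPLACE_ME',
    project_name='service',
    user_domain_name='Default',
    project_domain_name='Default',
)
sess = session.Session(auth=auth)

# An Unauthorized raised here reproduces the 401 in the log: Keystone
# rejects the credentials, so Nova cannot get a token for Neutron calls.
print(sess.get_token())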
[ 1334.448052] env[68617]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1334.448052] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1334.448052] env[68617]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1334.448052] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1334.448052] env[68617]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1334.448052] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1334.448052] env[68617]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1334.448052] env[68617]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1334.448052] env[68617]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-317e0c84-22de-4b77-adb1-8a396877c785'] [ 1334.448052] env[68617]: ERROR oslo.service.loopingcall [ 1334.448052] env[68617]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1334.448052] env[68617]: ERROR oslo.service.loopingcall [ 1334.448052] env[68617]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1334.448052] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1334.448052] env[68617]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1334.448974] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1334.448974] env[68617]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1334.448974] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1334.448974] env[68617]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1334.448974] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1334.448974] env[68617]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1334.448974] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1334.448974] env[68617]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1334.448974] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1334.448974] env[68617]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1334.448974] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1334.448974] env[68617]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1334.448974] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1334.448974] env[68617]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1334.448974] env[68617]: ERROR 
oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1334.448974] env[68617]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1334.448974] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1334.448974] env[68617]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1334.449587] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1334.449587] env[68617]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1334.449587] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1334.449587] env[68617]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1334.449587] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1334.449587] env[68617]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1334.449587] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1334.449587] env[68617]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1334.449587] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1334.449587] env[68617]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1334.449587] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1334.449587] env[68617]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1334.449587] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1334.449587] env[68617]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1334.449587] env[68617]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1334.449587] env[68617]: ERROR oslo.service.loopingcall [ 1334.450924] env[68617]: ERROR nova.compute.manager [None req-02678a3c-83cc-4b14-9b23-d90f9e93cbd7 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
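The "Dynamic interval looping call ... RetryDecorator" failure above comes from oslo.service's retry wrapper: the decorated function is re-run with an increasing sleep whenever it raises one of the listed exception types, while any other exception, such as NeutronAdminCredentialConfigurationInvalid here, propagates immediately. A self-contained sketch under assumed values (the exception class and retry budget below are illustrative, not Nova's):

from oslo_service import loopingcall


class TransientDeallocError(Exception):
    """Illustrative retriable failure, not one of Nova's exceptions."""


attempts = {'count': 0}


@loopingcall.RetryDecorator(max_retry_count=3, inc_sleep_time=1,
                            max_sleep_time=5,
                            exceptions=(TransientDeallocError,))
def deallocate_with_retries():
    # Fails twice with a retriable error, then succeeds; an exception
    # type outside `exceptions` would be re-raised without any retry.
    attempts['count'] += 1
    if attempts['count'] < 3:
        raise TransientDeallocError()
    return 'deallocated'


print(deallocate_with_retries())  # third attempt returns 'deallocated'

This relies on oslo.service's eventlet-based looping-call machinery, the same mechanism driving _deallocate_network_with_retries in the traceback above.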
[ 1334.459926] env[68617]: DEBUG oslo_concurrency.lockutils [None req-513971dc-01da-410d-ae01-53e625bf6a3c tempest-AttachInterfacesV270Test-135274226 tempest-AttachInterfacesV270Test-135274226-project-member] Lock "59df690b-bfbb-4976-b80b-60106c53ba25" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 214.818s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1334.474652] env[68617]: DEBUG nova.compute.manager [None req-d40a1f2b-bd26-4fa5-9cc2-e377610bb628 tempest-ServersTestManualDisk-623166759 tempest-ServersTestManualDisk-623166759-project-member] [instance: 98b47fc9-678d-4c60-b9e5-78423719ae76] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1334.495819] env[68617]: ERROR nova.compute.manager [None req-02678a3c-83cc-4b14-9b23-d90f9e93cbd7 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1334.495819] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Traceback (most recent call last): [ 1334.495819] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1334.495819] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] ret = obj(*args, **kwargs) [ 1334.495819] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1334.495819] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] exception_handler_v20(status_code, error_body) [ 1334.495819] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1334.495819] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] raise client_exc(message=error_message, [ 1334.495819] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1334.495819] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Neutron server returns request_ids: ['req-317e0c84-22de-4b77-adb1-8a396877c785'] [ 1334.495819] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] [ 1334.496657] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] During handling of the above exception, another exception occurred: [ 1334.496657] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] [ 1334.496657] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Traceback (most recent call last): [ 1334.496657] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/compute/manager.py", 
line 3315, in do_terminate_instance [ 1334.496657] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] self._delete_instance(context, instance, bdms) [ 1334.496657] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1334.496657] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] self._shutdown_instance(context, instance, bdms) [ 1334.496657] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1334.496657] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] self._try_deallocate_network(context, instance, requested_networks) [ 1334.496657] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1334.496657] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] with excutils.save_and_reraise_exception(): [ 1334.496657] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1334.496657] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] self.force_reraise() [ 1334.497307] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1334.497307] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] raise self.value [ 1334.497307] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1334.497307] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] _deallocate_network_with_retries() [ 1334.497307] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1334.497307] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] return evt.wait() [ 1334.497307] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1334.497307] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] result = hub.switch() [ 1334.497307] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1334.497307] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] return self.greenlet.switch() [ 1334.497307] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1334.497307] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] result = func(*self.args, **self.kw) [ 1334.498200] 
env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1334.498200] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] result = f(*args, **kwargs) [ 1334.498200] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1334.498200] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] self._deallocate_network( [ 1334.498200] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1334.498200] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] self.network_api.deallocate_for_instance( [ 1334.498200] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1334.498200] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] data = neutron.list_ports(**search_opts) [ 1334.498200] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1334.498200] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] ret = obj(*args, **kwargs) [ 1334.498200] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1334.498200] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] return self.list('ports', self.ports_path, retrieve_all, [ 1334.498200] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1334.498738] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] ret = obj(*args, **kwargs) [ 1334.498738] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1334.498738] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] for r in self._pagination(collection, path, **params): [ 1334.498738] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1334.498738] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] res = self.get(path, params=params) [ 1334.498738] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1334.498738] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] ret = obj(*args, **kwargs) [ 1334.498738] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1334.498738] env[68617]: ERROR 
nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] return self.retry_request("GET", action, body=body, [ 1334.498738] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1334.498738] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] ret = obj(*args, **kwargs) [ 1334.498738] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1334.498738] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] return self.do_request(method, action, body=body, [ 1334.499163] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1334.499163] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] ret = obj(*args, **kwargs) [ 1334.499163] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1334.499163] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] self._handle_fault_response(status_code, replybody, resp) [ 1334.499163] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1334.499163] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1334.499163] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1334.499163] env[68617]: ERROR nova.compute.manager [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] [ 1334.509221] env[68617]: DEBUG nova.compute.manager [None req-d40a1f2b-bd26-4fa5-9cc2-e377610bb628 tempest-ServersTestManualDisk-623166759 tempest-ServersTestManualDisk-623166759-project-member] [instance: 98b47fc9-678d-4c60-b9e5-78423719ae76] Instance disappeared before build. 
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1334.531137] env[68617]: DEBUG oslo_concurrency.lockutils [None req-02678a3c-83cc-4b14-9b23-d90f9e93cbd7 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Lock "e6b6cbdd-11d6-44a6-8da7-98e0f52cef67" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.315s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1334.532309] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "e6b6cbdd-11d6-44a6-8da7-98e0f52cef67" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 44.992s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1334.532502] env[68617]: INFO nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] During sync_power_state the instance has a pending task (deleting). Skip. [ 1334.532672] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "e6b6cbdd-11d6-44a6-8da7-98e0f52cef67" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1334.534559] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d40a1f2b-bd26-4fa5-9cc2-e377610bb628 tempest-ServersTestManualDisk-623166759 tempest-ServersTestManualDisk-623166759-project-member] Lock "98b47fc9-678d-4c60-b9e5-78423719ae76" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 197.540s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1334.547341] env[68617]: DEBUG nova.compute.manager [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1334.595023] env[68617]: INFO nova.compute.manager [None req-02678a3c-83cc-4b14-9b23-d90f9e93cbd7 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] [instance: e6b6cbdd-11d6-44a6-8da7-98e0f52cef67] Successfully reverted task state from None on failure for instance. [ 1334.598633] env[68617]: ERROR oslo_messaging.rpc.server [None req-02678a3c-83cc-4b14-9b23-d90f9e93cbd7 tempest-ServerExternalEventsTest-1993835752 tempest-ServerExternalEventsTest-1993835752-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1334.598633] env[68617]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1334.598633] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1334.598633] env[68617]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1334.598633] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1334.598633] env[68617]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1334.598633] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1334.598633] env[68617]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1334.598633] env[68617]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1334.598633] env[68617]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-317e0c84-22de-4b77-adb1-8a396877c785'] [ 1334.598633] env[68617]: ERROR oslo_messaging.rpc.server [ 1334.598633] env[68617]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1334.598633] env[68617]: ERROR oslo_messaging.rpc.server [ 1334.598633] env[68617]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1334.598633] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1334.599144] env[68617]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1334.599144] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1334.599144] env[68617]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1334.599144] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1334.599144] env[68617]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1334.599144] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1334.599144] env[68617]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1334.599144] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1334.599144] env[68617]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1334.599144] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1334.599144] env[68617]: ERROR oslo_messaging.rpc.server raise self.value [ 1334.599144] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1334.599144] env[68617]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1334.599144] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1334.599144] env[68617]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 1334.599144] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1334.599144] env[68617]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1334.599144] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1334.599681] env[68617]: ERROR oslo_messaging.rpc.server raise self.value [ 1334.599681] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1334.599681] env[68617]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1334.599681] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 1334.599681] env[68617]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1334.599681] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1334.599681] env[68617]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1334.599681] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1334.599681] env[68617]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1334.599681] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1334.599681] env[68617]: ERROR oslo_messaging.rpc.server raise self.value [ 1334.599681] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1334.599681] env[68617]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1334.599681] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 1334.599681] env[68617]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1334.599681] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1334.599681] env[68617]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1334.599681] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 1334.600230] env[68617]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1334.600230] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1334.600230] env[68617]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1334.600230] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1334.600230] env[68617]: ERROR oslo_messaging.rpc.server raise self.value [ 1334.600230] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1334.600230] env[68617]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1334.600230] env[68617]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1334.600230] env[68617]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1334.600230] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1334.600230] env[68617]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1334.600230] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1334.600230] env[68617]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1334.600230] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1334.600230] env[68617]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1334.600230] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1334.600230] env[68617]: ERROR oslo_messaging.rpc.server raise self.value [ 1334.600230] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1334.600784] env[68617]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1334.600784] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1334.600784] env[68617]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1334.600784] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1334.600784] env[68617]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1334.600784] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1334.600784] env[68617]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1334.600784] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1334.600784] env[68617]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1334.600784] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1334.600784] env[68617]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1334.600784] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1334.600784] env[68617]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1334.600784] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1334.600784] env[68617]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1334.600784] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1334.600784] env[68617]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1334.600784] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1334.601630] env[68617]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1334.601630] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1334.601630] env[68617]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1334.601630] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1334.601630] env[68617]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1334.601630] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1334.601630] env[68617]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1334.601630] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1334.601630] env[68617]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1334.601630] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1334.601630] env[68617]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1334.601630] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1334.601630] env[68617]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1334.601630] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1334.601630] env[68617]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1334.601630] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1334.601630] env[68617]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1334.601630] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1334.602865] env[68617]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1334.602865] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1334.602865] env[68617]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1334.602865] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1334.602865] env[68617]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1334.602865] env[68617]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
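Each layer of the traceback above re-raises through excutils.save_and_reraise_exception(): the context manager captures the in-flight exception, lets cleanup code run, and re-raises the original on exit (force_reraise) unless the caller explicitly suppresses it. A minimal sketch; the cleanup function and the RuntimeError are illustrative:

from oslo_utils import excutils


def revert_task_state():
    # Illustrative cleanup, in the spirit of the "Successfully reverted
    # task state" message logged after the failure above.
    print('cleanup ran; the original exception is re-raised on exit')


def shutdown_instance():
    try:
        raise RuntimeError('deallocate failed')  # stand-in failure
    except RuntimeError:
        with excutils.save_and_reraise_exception():
            revert_task_state()


# shutdown_instance() still raises the original RuntimeError with its
# traceback intact, matching the chained re-raises in the log.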
[ 1334.602865] env[68617]: ERROR oslo_messaging.rpc.server [ 1334.612662] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1334.616900] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1334.616900] env[68617]: INFO nova.compute.claims [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1335.006182] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc715bd8-c2b2-4fe4-8f7d-b07f70a1347f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1335.012311] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc3bcfcf-e20d-4561-831e-8571ede30bdf {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1335.044660] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04978902-0d83-45e1-ae38-9cca82944bd1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1335.053613] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f23573a-7069-4b93-a508-13980e72df28 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1335.068880] env[68617]: DEBUG nova.compute.provider_tree [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1335.086066] env[68617]: DEBUG nova.scheduler.client.report [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1335.101133] env[68617]: DEBUG oslo_concurrency.lockutils 
[None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.489s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1335.102079] env[68617]: DEBUG nova.compute.manager [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1335.149041] env[68617]: DEBUG nova.compute.utils [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1335.149041] env[68617]: DEBUG nova.compute.manager [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1335.149041] env[68617]: DEBUG nova.network.neutron [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1335.162236] env[68617]: DEBUG nova.compute.manager [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1335.229081] env[68617]: DEBUG nova.compute.manager [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1335.257668] env[68617]: DEBUG nova.virt.hardware [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1335.258111] env[68617]: DEBUG nova.virt.hardware [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1335.258380] env[68617]: DEBUG nova.virt.hardware [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1335.258682] env[68617]: DEBUG nova.virt.hardware [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1335.258964] env[68617]: DEBUG nova.virt.hardware [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1335.259251] env[68617]: DEBUG nova.virt.hardware [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1335.259568] env[68617]: DEBUG nova.virt.hardware [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1335.259831] env[68617]: DEBUG nova.virt.hardware [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 1335.261807] env[68617]: DEBUG nova.virt.hardware [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1335.261807] env[68617]: DEBUG nova.virt.hardware [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1335.261807] env[68617]: DEBUG nova.virt.hardware [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1335.261807] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3e97163-1071-4d0e-a48e-9a5fb5a203a5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1335.270620] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-987308a4-75d9-4bee-a319-3f67dcc24f6b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1335.290819] env[68617]: DEBUG nova.policy [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9bf7626b69414219b4266196ece620ed', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7822d4c6136c45ca84919a3b6b308457', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 1335.945146] env[68617]: DEBUG nova.network.neutron [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Successfully created port: 4e55458f-3c93-43e7-b781-cceb29302260 {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1336.457282] env[68617]: DEBUG nova.compute.manager [req-1209a210-561b-4b09-8e31-ca50a6136b79 req-6dd329a0-eb20-423f-8f6a-7e0bebb7b365 service nova] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Received event network-vif-plugged-4e55458f-3c93-43e7-b781-cceb29302260 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1336.457542] env[68617]: DEBUG oslo_concurrency.lockutils [req-1209a210-561b-4b09-8e31-ca50a6136b79 req-6dd329a0-eb20-423f-8f6a-7e0bebb7b365 service nova] Acquiring lock "e90877a8-47d3-47d7-8362-5bcfe3a98c36-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1336.457785] env[68617]: DEBUG oslo_concurrency.lockutils [req-1209a210-561b-4b09-8e31-ca50a6136b79 req-6dd329a0-eb20-423f-8f6a-7e0bebb7b365 service nova] Lock "e90877a8-47d3-47d7-8362-5bcfe3a98c36-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1336.457958] env[68617]: DEBUG oslo_concurrency.lockutils [req-1209a210-561b-4b09-8e31-ca50a6136b79 req-6dd329a0-eb20-423f-8f6a-7e0bebb7b365 service nova] Lock "e90877a8-47d3-47d7-8362-5bcfe3a98c36-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1336.458202] env[68617]: DEBUG nova.compute.manager [req-1209a210-561b-4b09-8e31-ca50a6136b79 req-6dd329a0-eb20-423f-8f6a-7e0bebb7b365 service nova] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] No waiting events found dispatching network-vif-plugged-4e55458f-3c93-43e7-b781-cceb29302260 {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1336.458411] env[68617]: WARNING nova.compute.manager [req-1209a210-561b-4b09-8e31-ca50a6136b79 req-6dd329a0-eb20-423f-8f6a-7e0bebb7b365 service nova] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Received unexpected event network-vif-plugged-4e55458f-3c93-43e7-b781-cceb29302260 for instance with vm_state building and task_state spawning. [ 1336.533247] env[68617]: DEBUG nova.network.neutron [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Successfully updated port: 4e55458f-3c93-43e7-b781-cceb29302260 {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1336.543367] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Acquiring lock "refresh_cache-e90877a8-47d3-47d7-8362-5bcfe3a98c36" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1336.543513] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Acquired lock "refresh_cache-e90877a8-47d3-47d7-8362-5bcfe3a98c36" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1336.543657] env[68617]: DEBUG nova.network.neutron [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1336.587997] env[68617]: DEBUG nova.network.neutron [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Instance cache missing network info. 
{{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1336.763954] env[68617]: DEBUG nova.network.neutron [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Updating instance_info_cache with network_info: [{"id": "4e55458f-3c93-43e7-b781-cceb29302260", "address": "fa:16:3e:6a:6a:fa", "network": {"id": "e3aee9db-8596-4ea8-943e-5c365382ee22", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "f1a3ab6230dd468b8019424ce71de8ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cde23701-02ca-4cb4-b5a6-d321f8ac9660", "external-id": "nsx-vlan-transportzone-586", "segmentation_id": 586, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4e55458f-3c", "ovs_interfaceid": "4e55458f-3c93-43e7-b781-cceb29302260", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1336.781127] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Releasing lock "refresh_cache-e90877a8-47d3-47d7-8362-5bcfe3a98c36" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1336.781426] env[68617]: DEBUG nova.compute.manager [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Instance network_info: |[{"id": "4e55458f-3c93-43e7-b781-cceb29302260", "address": "fa:16:3e:6a:6a:fa", "network": {"id": "e3aee9db-8596-4ea8-943e-5c365382ee22", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "f1a3ab6230dd468b8019424ce71de8ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cde23701-02ca-4cb4-b5a6-d321f8ac9660", "external-id": "nsx-vlan-transportzone-586", "segmentation_id": 586, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4e55458f-3c", "ovs_interfaceid": "4e55458f-3c93-43e7-b781-cceb29302260", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1336.781799] env[68617]: 
DEBUG nova.virt.vmwareapi.vmops [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6a:6a:fa', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'cde23701-02ca-4cb4-b5a6-d321f8ac9660', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4e55458f-3c93-43e7-b781-cceb29302260', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1336.789397] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Creating folder: Project (7822d4c6136c45ca84919a3b6b308457). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1336.789897] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3ab9ab00-631d-4dec-9e9b-18147716ac51 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1336.800535] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Created folder: Project (7822d4c6136c45ca84919a3b6b308457) in parent group-v693691. [ 1336.800721] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Creating folder: Instances. Parent ref: group-v693761. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1336.800939] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6b24639f-edcd-487f-b881-2b560f7aee3e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1336.810408] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Created folder: Instances in parent group-v693761. [ 1336.810632] env[68617]: DEBUG oslo.service.loopingcall [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1336.810807] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1336.810993] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5ce8b9f3-93ce-4f28-bd84-5fa29d21c676 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1336.832441] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1336.832441] env[68617]: value = "task-3470810" [ 1336.832441] env[68617]: _type = "Task" [ 1336.832441] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1336.839821] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470810, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1337.342659] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470810, 'name': CreateVM_Task, 'duration_secs': 0.286102} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1337.342991] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1337.350154] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1337.350331] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1337.350656] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1337.350910] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a6420505-7bc8-41a7-be2e-3a86ccfacaaf {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1337.355827] env[68617]: DEBUG oslo_vmware.api [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Waiting for the task: (returnval){ [ 1337.355827] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52b72f9f-590b-592f-a0ff-4980948a887f" [ 1337.355827] env[68617]: _type = "Task" [ 1337.355827] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1337.363809] env[68617]: DEBUG oslo_vmware.api [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52b72f9f-590b-592f-a0ff-4980948a887f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1337.685799] env[68617]: DEBUG oslo_concurrency.lockutils [None req-dd4e098a-0a0c-4e84-aac8-2dcaf30a9e2f tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Acquiring lock "e90877a8-47d3-47d7-8362-5bcfe3a98c36" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1337.866021] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1337.866291] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1337.866503] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1338.589473] env[68617]: DEBUG nova.compute.manager [req-04335d11-6a24-44b7-b6f6-d4d5c38c955c req-1ab2d378-dfd2-46b5-b016-9b4473f50e21 service nova] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Received event network-changed-4e55458f-3c93-43e7-b781-cceb29302260 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1338.589732] env[68617]: DEBUG nova.compute.manager [req-04335d11-6a24-44b7-b6f6-d4d5c38c955c req-1ab2d378-dfd2-46b5-b016-9b4473f50e21 service nova] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Refreshing instance network info cache due to event network-changed-4e55458f-3c93-43e7-b781-cceb29302260. 
{{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1338.589966] env[68617]: DEBUG oslo_concurrency.lockutils [req-04335d11-6a24-44b7-b6f6-d4d5c38c955c req-1ab2d378-dfd2-46b5-b016-9b4473f50e21 service nova] Acquiring lock "refresh_cache-e90877a8-47d3-47d7-8362-5bcfe3a98c36" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1338.590044] env[68617]: DEBUG oslo_concurrency.lockutils [req-04335d11-6a24-44b7-b6f6-d4d5c38c955c req-1ab2d378-dfd2-46b5-b016-9b4473f50e21 service nova] Acquired lock "refresh_cache-e90877a8-47d3-47d7-8362-5bcfe3a98c36" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1338.590270] env[68617]: DEBUG nova.network.neutron [req-04335d11-6a24-44b7-b6f6-d4d5c38c955c req-1ab2d378-dfd2-46b5-b016-9b4473f50e21 service nova] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Refreshing network info cache for port 4e55458f-3c93-43e7-b781-cceb29302260 {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1338.870848] env[68617]: DEBUG nova.network.neutron [req-04335d11-6a24-44b7-b6f6-d4d5c38c955c req-1ab2d378-dfd2-46b5-b016-9b4473f50e21 service nova] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Updated VIF entry in instance network info cache for port 4e55458f-3c93-43e7-b781-cceb29302260. {{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1338.871234] env[68617]: DEBUG nova.network.neutron [req-04335d11-6a24-44b7-b6f6-d4d5c38c955c req-1ab2d378-dfd2-46b5-b016-9b4473f50e21 service nova] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Updating instance_info_cache with network_info: [{"id": "4e55458f-3c93-43e7-b781-cceb29302260", "address": "fa:16:3e:6a:6a:fa", "network": {"id": "e3aee9db-8596-4ea8-943e-5c365382ee22", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "f1a3ab6230dd468b8019424ce71de8ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cde23701-02ca-4cb4-b5a6-d321f8ac9660", "external-id": "nsx-vlan-transportzone-586", "segmentation_id": 586, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4e55458f-3c", "ovs_interfaceid": "4e55458f-3c93-43e7-b781-cceb29302260", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1338.881412] env[68617]: DEBUG oslo_concurrency.lockutils [req-04335d11-6a24-44b7-b6f6-d4d5c38c955c req-1ab2d378-dfd2-46b5-b016-9b4473f50e21 service nova] Releasing lock "refresh_cache-e90877a8-47d3-47d7-8362-5bcfe3a98c36" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1341.264516] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Acquiring lock 
"6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1341.264828] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Lock "6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1341.293050] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Acquiring lock "a019d654-82ed-4ef2-850f-39a1f324566a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1341.293305] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Lock "a019d654-82ed-4ef2-850f-39a1f324566a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1342.469833] env[68617]: DEBUG oslo_concurrency.lockutils [None req-15b7b7d0-4e37-4a51-8b47-8f5c30bc73d8 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "43495abf-8f99-4f51-81ca-80a43c266695" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1342.470138] env[68617]: DEBUG oslo_concurrency.lockutils [None req-15b7b7d0-4e37-4a51-8b47-8f5c30bc73d8 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "43495abf-8f99-4f51-81ca-80a43c266695" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1348.427193] env[68617]: DEBUG oslo_concurrency.lockutils [None req-bd9e7f4b-8a51-4064-946a-00dbae218b70 tempest-AttachVolumeTestJSON-339037198 tempest-AttachVolumeTestJSON-339037198-project-member] Acquiring lock "b9d0b85a-f0ac-4f9e-bec4-a82db0eb96c3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1348.427543] env[68617]: DEBUG oslo_concurrency.lockutils [None req-bd9e7f4b-8a51-4064-946a-00dbae218b70 tempest-AttachVolumeTestJSON-339037198 tempest-AttachVolumeTestJSON-339037198-project-member] Lock "b9d0b85a-f0ac-4f9e-bec4-a82db0eb96c3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} 
[ 1352.486726] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6be4c7fd-8e1f-4c7f-83a2-e9f158814247 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquiring lock "5d294d66-266f-4a0b-be49-5061fb65b226" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1352.487197] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6be4c7fd-8e1f-4c7f-83a2-e9f158814247 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Lock "5d294d66-266f-4a0b-be49-5061fb65b226" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1353.427528] env[68617]: DEBUG oslo_concurrency.lockutils [None req-18863ad0-b310-42ab-b5a2-bab78170f76b tempest-ImagesTestJSON-918330909 tempest-ImagesTestJSON-918330909-project-member] Acquiring lock "f8c0a514-7e7f-455a-b84d-9afc2957945c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1353.427528] env[68617]: DEBUG oslo_concurrency.lockutils [None req-18863ad0-b310-42ab-b5a2-bab78170f76b tempest-ImagesTestJSON-918330909 tempest-ImagesTestJSON-918330909-project-member] Lock "f8c0a514-7e7f-455a-b84d-9afc2957945c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1358.188340] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c02501af-c000-48aa-ada4-d670a1fa0355 tempest-ServersV294TestFqdnHostnames-114980253 tempest-ServersV294TestFqdnHostnames-114980253-project-member] Acquiring lock "9ca297f6-3239-48d3-9b67-dd1637a3bc25" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1358.188705] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c02501af-c000-48aa-ada4-d670a1fa0355 tempest-ServersV294TestFqdnHostnames-114980253 tempest-ServersV294TestFqdnHostnames-114980253-project-member] Lock "9ca297f6-3239-48d3-9b67-dd1637a3bc25" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1367.487633] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1367.489492] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1367.489492] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache 
/opt/stack/nova/nova/compute/manager.py:9915}} [ 1367.518711] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1367.519127] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1367.519370] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1367.519667] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1367.519913] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1367.520172] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1367.520427] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1367.520679] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1367.521880] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1367.521880] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1367.521880] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1368.698779] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1369.694963] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1369.725369] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1370.699105] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1370.699360] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1371.643592] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5a02a231-4f41-46be-a700-c796ffc4183b tempest-ServersNegativeTestMultiTenantJSON-1012065245 tempest-ServersNegativeTestMultiTenantJSON-1012065245-project-member] Acquiring lock "57cdcf44-576a-4343-9277-4b9ebb2b194a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1371.643943] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5a02a231-4f41-46be-a700-c796ffc4183b tempest-ServersNegativeTestMultiTenantJSON-1012065245 tempest-ServersNegativeTestMultiTenantJSON-1012065245-project-member] Lock "57cdcf44-576a-4343-9277-4b9ebb2b194a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1371.698783] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1371.698955] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1373.698933] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1375.699093] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1375.711084] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1375.711084] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1375.711084] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1375.711084] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1375.711959] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2de0b71-d88d-4670-81a0-f268f96ccb36 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1375.722507] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-219c3c34-ab8a-4601-a010-e4c6212e769b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1375.736609] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4141324-b71e-4dfc-a59c-d277ad1a1361 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1375.743066] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54fb4eb8-c09d-4189-bdd6-7152a353a824 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1375.771466] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180892MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1375.771620] env[68617]: DEBUG 
oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1375.771819] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1375.850841] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b27ace75-e2fa-4acc-96cb-88dd49b89de5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1375.851042] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 995585f5-57a4-4ba6-9e28-18a086af264c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1375.851178] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 82864ac3-a199-478c-8c57-97ea0a256201 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1375.851301] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1375.851420] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 79c92a1b-20ef-4360-93b4-913cbfcf92fe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1375.851537] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 1cc42c7f-8781-40b0-9f75-edfef3bc90e7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1375.851652] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance d46ca6f3-0ee9-412c-98b4-f639ce4f9228 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1375.851765] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a8ff6232-530c-453a-96e4-f8ce00f976e3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1375.851878] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1375.851994] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e90877a8-47d3-47d7-8362-5bcfe3a98c36 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1375.862984] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f03b9bc5-9438-4c0c-b595-72c631bece08 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1375.873558] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 570302ee-2383-4659-80e1-af4b16d03a21 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1375.884187] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 7bf75617-fcd8-4d96-bf02-ddb723e8ad96 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1375.893669] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance ee6efd93-25be-4268-afe9-ba39e543a4fb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1375.905213] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 1605028f-5d6d-4ac4-8416-c0465982c53a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1375.915444] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance fc1043b8-535d-4af0-b92b-1f43580cdc9a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1375.926166] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1375.936056] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a019d654-82ed-4ef2-850f-39a1f324566a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1375.946127] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 43495abf-8f99-4f51-81ca-80a43c266695 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1375.955705] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b9d0b85a-f0ac-4f9e-bec4-a82db0eb96c3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1375.964656] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 5d294d66-266f-4a0b-be49-5061fb65b226 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1375.974103] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f8c0a514-7e7f-455a-b84d-9afc2957945c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1375.984129] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 9ca297f6-3239-48d3-9b67-dd1637a3bc25 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1375.994033] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 57cdcf44-576a-4343-9277-4b9ebb2b194a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1375.994033] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1375.994236] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1856MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1376.248743] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-931d9cef-a183-492f-9a45-a8794b7990de {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1376.256667] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73acc764-f9d4-49be-b965-bbb07f90d2a6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1376.286539] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1400e54e-6583-46f7-8c51-ce59b9ce77c5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1376.293672] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77adc033-714f-4a2f-8bf6-704ca2873fc8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1376.306170] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 
5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1376.314178] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1376.327119] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1376.327305] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.555s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1377.322581] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1382.363018] env[68617]: WARNING oslo_vmware.rw_handles [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1382.363018] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1382.363018] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1382.363018] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1382.363018] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1382.363018] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 1382.363018] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1382.363018] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1382.363018] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1382.363018] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1382.363018] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1382.363018] env[68617]: ERROR oslo_vmware.rw_handles [ 1382.363558] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Downloaded image file data 
c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/e330c5f5-a78b-4890-8b9f-ce0c40f17fdf/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1382.365847] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1382.366125] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Copying Virtual Disk [datastore2] vmware_temp/e330c5f5-a78b-4890-8b9f-ce0c40f17fdf/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/e330c5f5-a78b-4890-8b9f-ce0c40f17fdf/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1382.366414] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b21e0274-cb42-4c12-8a00-2c7d8e374078 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1382.375116] env[68617]: DEBUG oslo_vmware.api [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Waiting for the task: (returnval){ [ 1382.375116] env[68617]: value = "task-3470811" [ 1382.375116] env[68617]: _type = "Task" [ 1382.375116] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1382.382595] env[68617]: DEBUG oslo_vmware.api [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Task: {'id': task-3470811, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1382.885548] env[68617]: DEBUG oslo_vmware.exceptions [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Fault InvalidArgument not matched. 
{{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1382.885846] env[68617]: DEBUG oslo_concurrency.lockutils [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1382.886403] env[68617]: ERROR nova.compute.manager [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1382.886403] env[68617]: Faults: ['InvalidArgument'] [ 1382.886403] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Traceback (most recent call last): [ 1382.886403] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1382.886403] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] yield resources [ 1382.886403] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1382.886403] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] self.driver.spawn(context, instance, image_meta, [ 1382.886403] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1382.886403] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1382.886403] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1382.886403] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] self._fetch_image_if_missing(context, vi) [ 1382.886403] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1382.886682] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] image_cache(vi, tmp_image_ds_loc) [ 1382.886682] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1382.886682] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] vm_util.copy_virtual_disk( [ 1382.886682] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1382.886682] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] session._wait_for_task(vmdk_copy_task) [ 1382.886682] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1382.886682] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] return self.wait_for_task(task_ref) [ 1382.886682] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1382.886682] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] return evt.wait() [ 1382.886682] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1382.886682] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] result = hub.switch() [ 1382.886682] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1382.886682] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] return self.greenlet.switch() [ 1382.886998] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1382.886998] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] self.f(*self.args, **self.kw) [ 1382.886998] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1382.886998] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] raise exceptions.translate_fault(task_info.error) [ 1382.886998] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1382.886998] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Faults: ['InvalidArgument'] [ 1382.886998] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] [ 1382.886998] env[68617]: INFO nova.compute.manager [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Terminating instance [ 1382.888228] env[68617]: DEBUG oslo_concurrency.lockutils [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1382.888434] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1382.889091] env[68617]: DEBUG nova.compute.manager [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 
tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1382.889286] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1382.889510] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f1e4ecd9-a53f-447d-ac28-5686e0b2711f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1382.891757] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f46369c5-bbb5-4237-aa98-5a5d4e77d76e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1382.898618] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1382.898855] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d21f18ee-74dd-448d-8e5f-0311d2f72f14 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1382.901021] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1382.901167] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1382.902096] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-14d860b2-cc4d-42a9-b052-49888bd2d884 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1382.906934] env[68617]: DEBUG oslo_vmware.api [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Waiting for the task: (returnval){ [ 1382.906934] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]5288143c-dced-ce9d-59f9-30b85aac6f63" [ 1382.906934] env[68617]: _type = "Task" [ 1382.906934] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1382.914098] env[68617]: DEBUG oslo_vmware.api [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]5288143c-dced-ce9d-59f9-30b85aac6f63, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1382.972050] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1382.972333] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1382.972593] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Deleting the datastore file [datastore2] b27ace75-e2fa-4acc-96cb-88dd49b89de5 {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1382.973041] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f9445a52-b97f-4ae8-b3a8-b10d91496d1c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1382.979784] env[68617]: DEBUG oslo_vmware.api [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Waiting for the task: (returnval){ [ 1382.979784] env[68617]: value = "task-3470813" [ 1382.979784] env[68617]: _type = "Task" [ 1382.979784] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1382.987905] env[68617]: DEBUG oslo_vmware.api [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Task: {'id': task-3470813, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1383.417729] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1383.418186] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Creating directory with path [datastore2] vmware_temp/df9882af-be91-4544-a0d9-72d64e57d0c6/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1383.418493] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6c7b34e0-7868-4acd-a5b3-23bfec734845 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.430377] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Created directory with path [datastore2] vmware_temp/df9882af-be91-4544-a0d9-72d64e57d0c6/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1383.430577] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Fetch image to [datastore2] vmware_temp/df9882af-be91-4544-a0d9-72d64e57d0c6/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1383.430751] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/df9882af-be91-4544-a0d9-72d64e57d0c6/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1383.431495] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8df3a506-f54d-416e-ac1e-e970922642d1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.439667] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdb69fca-6896-4736-959c-db5710732cfa {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.449394] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26ef2be1-ca7b-4160-a5b3-471ce764771b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.483665] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb45b2a4-5d9a-4f3f-bb37-56184c4e571c 
{{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.490464] env[68617]: DEBUG oslo_vmware.api [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Task: {'id': task-3470813, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.06469} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1383.491906] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1383.492116] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1383.492304] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1383.492511] env[68617]: INFO nova.compute.manager [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Took 0.60 seconds to destroy the instance on the hypervisor. 
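Editor's sketch: the InvalidArgument/fileType traceback above bottoms out in oslo_vmware's task polling: _poll_task reads task_info.error, raises the exception from exceptions.translate_fault(), and the driver lets it propagate out of spawn(). A minimal illustration of that call shape, assuming oslo_vmware is installed; the session and task handle here are stand-ins for the driver's own objects, not Nova code:

    from oslo_vmware import exceptions as vexc

    def copy_and_wait(session, copy_task):
        # session.wait_for_task() polls the vCenter task and, if the task
        # errors out, raises the exception produced by
        # exceptions.translate_fault(task_info.error) -- the exact path
        # shown in the traceback above.
        try:
            return session.wait_for_task(copy_task)
        except vexc.VimFaultException as e:
            # e.fault_list carries the raw fault names, e.g. ['InvalidArgument'];
            # the message carries the text ("A specified parameter was not
            # correct: fileType"). Re-raising lets _build_and_run_instance
            # abort the claim and reschedule, as the log shows next.
            raise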
[ 1383.494559] env[68617]: DEBUG nova.compute.claims [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1383.494731] env[68617]: DEBUG oslo_concurrency.lockutils [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1383.494939] env[68617]: DEBUG oslo_concurrency.lockutils [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1383.497455] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8f0178d4-2d32-4a96-906f-f32b76842e9f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.519510] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1383.572133] env[68617]: DEBUG oslo_vmware.rw_handles [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/df9882af-be91-4544-a0d9-72d64e57d0c6/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1383.633088] env[68617]: DEBUG oslo_vmware.rw_handles [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1383.633177] env[68617]: DEBUG oslo_vmware.rw_handles [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/df9882af-be91-4544-a0d9-72d64e57d0c6/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1383.880779] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8881ca7d-d7d6-4136-8646-c08951f4f612 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.888244] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6378045-122c-4ed1-8c5f-46c383a865d7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.917216] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8aedb45e-7341-4465-98de-0be72aa0d3ee {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.924042] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68186479-aa4e-456c-85af-3450fe693e06 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.936716] env[68617]: DEBUG nova.compute.provider_tree [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1383.948983] env[68617]: DEBUG nova.scheduler.client.report [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1383.961783] env[68617]: DEBUG oslo_concurrency.lockutils [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.467s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1383.962379] env[68617]: ERROR nova.compute.manager [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1383.962379] env[68617]: Faults: ['InvalidArgument'] [ 1383.962379] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Traceback (most recent call last): [ 1383.962379] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1383.962379] 
env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] self.driver.spawn(context, instance, image_meta, [ 1383.962379] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1383.962379] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1383.962379] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1383.962379] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] self._fetch_image_if_missing(context, vi) [ 1383.962379] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1383.962379] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] image_cache(vi, tmp_image_ds_loc) [ 1383.962379] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1383.962707] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] vm_util.copy_virtual_disk( [ 1383.962707] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1383.962707] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] session._wait_for_task(vmdk_copy_task) [ 1383.962707] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1383.962707] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] return self.wait_for_task(task_ref) [ 1383.962707] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1383.962707] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] return evt.wait() [ 1383.962707] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1383.962707] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] result = hub.switch() [ 1383.962707] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1383.962707] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] return self.greenlet.switch() [ 1383.962707] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1383.962707] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] self.f(*self.args, **self.kw) [ 1383.963025] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1383.963025] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] raise exceptions.translate_fault(task_info.error) [ 1383.963025] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1383.963025] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Faults: ['InvalidArgument'] [ 1383.963025] env[68617]: ERROR nova.compute.manager [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] [ 1383.963310] env[68617]: DEBUG nova.compute.utils [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1383.964808] env[68617]: DEBUG nova.compute.manager [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Build of instance b27ace75-e2fa-4acc-96cb-88dd49b89de5 was re-scheduled: A specified parameter was not correct: fileType [ 1383.964808] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1383.965257] env[68617]: DEBUG nova.compute.manager [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1383.965471] env[68617]: DEBUG nova.compute.manager [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1383.965719] env[68617]: DEBUG nova.compute.manager [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1383.965976] env[68617]: DEBUG nova.network.neutron [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1384.280326] env[68617]: DEBUG nova.network.neutron [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1384.292938] env[68617]: INFO nova.compute.manager [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Took 0.33 seconds to deallocate network for instance. [ 1384.383103] env[68617]: INFO nova.scheduler.client.report [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Deleted allocations for instance b27ace75-e2fa-4acc-96cb-88dd49b89de5 [ 1384.402358] env[68617]: DEBUG oslo_concurrency.lockutils [None req-4c698103-945f-455a-9ca4-4e86c4a2193b tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Lock "b27ace75-e2fa-4acc-96cb-88dd49b89de5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 671.823s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.403675] env[68617]: DEBUG oslo_concurrency.lockutils [None req-510bb851-f7ec-43a9-98d3-f0340658b54a tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Lock "b27ace75-e2fa-4acc-96cb-88dd49b89de5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 473.537s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1384.403945] env[68617]: DEBUG oslo_concurrency.lockutils [None req-510bb851-f7ec-43a9-98d3-f0340658b54a tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Acquiring lock "b27ace75-e2fa-4acc-96cb-88dd49b89de5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1384.404190] env[68617]: DEBUG oslo_concurrency.lockutils [None req-510bb851-f7ec-43a9-98d3-f0340658b54a tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Lock "b27ace75-e2fa-4acc-96cb-88dd49b89de5-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1384.404408] env[68617]: DEBUG oslo_concurrency.lockutils [None req-510bb851-f7ec-43a9-98d3-f0340658b54a tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Lock "b27ace75-e2fa-4acc-96cb-88dd49b89de5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.406492] env[68617]: INFO nova.compute.manager [None req-510bb851-f7ec-43a9-98d3-f0340658b54a tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Terminating instance [ 1384.408239] env[68617]: DEBUG nova.compute.manager [None req-510bb851-f7ec-43a9-98d3-f0340658b54a tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1384.408452] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-510bb851-f7ec-43a9-98d3-f0340658b54a tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1384.408944] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a7266db3-6b30-405a-aaf0-95026bd245dd {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1384.418265] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59a02f1f-1c15-47bd-9ef8-dc3bc9237823 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1384.431303] env[68617]: DEBUG nova.compute.manager [None req-277d5d4d-ea80-411f-8178-19d3676a982d tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: c5764a1d-3370-4756-ada0-03b503368d17] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1384.453359] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-510bb851-f7ec-43a9-98d3-f0340658b54a tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b27ace75-e2fa-4acc-96cb-88dd49b89de5 could not be found.
[ 1384.453583] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-510bb851-f7ec-43a9-98d3-f0340658b54a tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1384.453767] env[68617]: INFO nova.compute.manager [None req-510bb851-f7ec-43a9-98d3-f0340658b54a tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1384.454043] env[68617]: DEBUG oslo.service.loopingcall [None req-510bb851-f7ec-43a9-98d3-f0340658b54a tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1384.454297] env[68617]: DEBUG nova.compute.manager [-] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1384.454398] env[68617]: DEBUG nova.network.neutron [-] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1384.456822] env[68617]: DEBUG nova.compute.manager [None req-277d5d4d-ea80-411f-8178-19d3676a982d tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: c5764a1d-3370-4756-ada0-03b503368d17] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1384.477593] env[68617]: DEBUG oslo_concurrency.lockutils [None req-277d5d4d-ea80-411f-8178-19d3676a982d tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Lock "c5764a1d-3370-4756-ada0-03b503368d17" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 240.759s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.481333] env[68617]: DEBUG nova.network.neutron [-] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1384.486214] env[68617]: DEBUG nova.compute.manager [None req-277d5d4d-ea80-411f-8178-19d3676a982d tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: c0528a20-34cb-4b51-bb4c-8c3828021a85] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1384.488728] env[68617]: INFO nova.compute.manager [-] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] Took 0.03 seconds to deallocate network for instance. [ 1384.507331] env[68617]: DEBUG nova.compute.manager [None req-277d5d4d-ea80-411f-8178-19d3676a982d tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: c0528a20-34cb-4b51-bb4c-8c3828021a85] Instance disappeared before build.
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1384.527893] env[68617]: DEBUG oslo_concurrency.lockutils [None req-277d5d4d-ea80-411f-8178-19d3676a982d tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Lock "c0528a20-34cb-4b51-bb4c-8c3828021a85" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 240.766s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.535749] env[68617]: DEBUG nova.compute.manager [None req-428077de-3a63-4b0c-a517-f64d25193b26 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: fa9b2716-783b-4b19-bfc9-aad609c3a659] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1384.558996] env[68617]: DEBUG nova.compute.manager [None req-428077de-3a63-4b0c-a517-f64d25193b26 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: fa9b2716-783b-4b19-bfc9-aad609c3a659] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1384.579811] env[68617]: DEBUG oslo_concurrency.lockutils [None req-510bb851-f7ec-43a9-98d3-f0340658b54a tempest-ListServerFiltersTestJSON-136232528 tempest-ListServerFiltersTestJSON-136232528-project-member] Lock "b27ace75-e2fa-4acc-96cb-88dd49b89de5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.176s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.581119] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "b27ace75-e2fa-4acc-96cb-88dd49b89de5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 95.040s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1384.581119] env[68617]: INFO nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: b27ace75-e2fa-4acc-96cb-88dd49b89de5] During sync_power_state the instance has a pending task (deleting). Skip. [ 1384.581119] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "b27ace75-e2fa-4acc-96cb-88dd49b89de5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.585585] env[68617]: DEBUG oslo_concurrency.lockutils [None req-428077de-3a63-4b0c-a517-f64d25193b26 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "fa9b2716-783b-4b19-bfc9-aad609c3a659" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 238.877s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.594594] env[68617]: DEBUG nova.compute.manager [None req-514c2fd6-8e81-4a02-9b43-0cce1a26c8db tempest-ServerActionsV293TestJSON-754830659 tempest-ServerActionsV293TestJSON-754830659-project-member] [instance: dd611e75-aac1-4cdb-b263-6956d6254743] Starting instance...
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1384.619116] env[68617]: DEBUG nova.compute.manager [None req-514c2fd6-8e81-4a02-9b43-0cce1a26c8db tempest-ServerActionsV293TestJSON-754830659 tempest-ServerActionsV293TestJSON-754830659-project-member] [instance: dd611e75-aac1-4cdb-b263-6956d6254743] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1384.657835] env[68617]: DEBUG oslo_concurrency.lockutils [None req-514c2fd6-8e81-4a02-9b43-0cce1a26c8db tempest-ServerActionsV293TestJSON-754830659 tempest-ServerActionsV293TestJSON-754830659-project-member] Lock "dd611e75-aac1-4cdb-b263-6956d6254743" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 237.750s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.668510] env[68617]: DEBUG nova.compute.manager [None req-21a980d9-c4a6-49bd-8bad-a84fc36b0223 tempest-AttachVolumeTestJSON-339037198 tempest-AttachVolumeTestJSON-339037198-project-member] [instance: 075eb6cb-a53b-44d9-986d-bc85d4b8ac25] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1384.695819] env[68617]: DEBUG nova.compute.manager [None req-21a980d9-c4a6-49bd-8bad-a84fc36b0223 tempest-AttachVolumeTestJSON-339037198 tempest-AttachVolumeTestJSON-339037198-project-member] [instance: 075eb6cb-a53b-44d9-986d-bc85d4b8ac25] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1384.715815] env[68617]: DEBUG oslo_concurrency.lockutils [None req-21a980d9-c4a6-49bd-8bad-a84fc36b0223 tempest-AttachVolumeTestJSON-339037198 tempest-AttachVolumeTestJSON-339037198-project-member] Lock "075eb6cb-a53b-44d9-986d-bc85d4b8ac25" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 234.541s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.724445] env[68617]: DEBUG nova.compute.manager [None req-b78ff9d6-5247-44ba-96c0-619c412e50d9 tempest-AttachInterfacesTestJSON-753337404 tempest-AttachInterfacesTestJSON-753337404-project-member] [instance: 65014c6f-8b4e-4468-9462-4b8cdc08af73] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1384.746973] env[68617]: DEBUG nova.compute.manager [None req-b78ff9d6-5247-44ba-96c0-619c412e50d9 tempest-AttachInterfacesTestJSON-753337404 tempest-AttachInterfacesTestJSON-753337404-project-member] [instance: 65014c6f-8b4e-4468-9462-4b8cdc08af73] Instance disappeared before build.
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1384.766663] env[68617]: DEBUG oslo_concurrency.lockutils [None req-b78ff9d6-5247-44ba-96c0-619c412e50d9 tempest-AttachInterfacesTestJSON-753337404 tempest-AttachInterfacesTestJSON-753337404-project-member] Lock "65014c6f-8b4e-4468-9462-4b8cdc08af73" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 231.006s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.774446] env[68617]: DEBUG nova.compute.manager [None req-a6159447-108b-431a-a879-8a4ec5c03363 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 7e1c7e8a-139e-4e8a-a3e1-39a2d7c3fc47] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1384.795922] env[68617]: DEBUG nova.compute.manager [None req-a6159447-108b-431a-a879-8a4ec5c03363 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 7e1c7e8a-139e-4e8a-a3e1-39a2d7c3fc47] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1384.814499] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a6159447-108b-431a-a879-8a4ec5c03363 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Lock "7e1c7e8a-139e-4e8a-a3e1-39a2d7c3fc47" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 229.628s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.822746] env[68617]: DEBUG nova.compute.manager [None req-4261bb17-68ab-4b31-99d3-638d8a02ef5f tempest-ImagesTestJSON-918330909 tempest-ImagesTestJSON-918330909-project-member] [instance: 2bffd2c4-f290-4df6-b7b6-6dd963befdab] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1384.843573] env[68617]: DEBUG nova.compute.manager [None req-4261bb17-68ab-4b31-99d3-638d8a02ef5f tempest-ImagesTestJSON-918330909 tempest-ImagesTestJSON-918330909-project-member] [instance: 2bffd2c4-f290-4df6-b7b6-6dd963befdab] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1384.863456] env[68617]: DEBUG oslo_concurrency.lockutils [None req-4261bb17-68ab-4b31-99d3-638d8a02ef5f tempest-ImagesTestJSON-918330909 tempest-ImagesTestJSON-918330909-project-member] Lock "2bffd2c4-f290-4df6-b7b6-6dd963befdab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 228.566s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.871234] env[68617]: DEBUG nova.compute.manager [None req-15c8b2c8-3704-406d-85e8-cb2c2467602c tempest-ServerActionsTestOtherB-1124123640 tempest-ServerActionsTestOtherB-1124123640-project-member] [instance: 13d6e00b-3c18-4346-b229-b56bdaba2dc8] Starting instance...
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1384.893502] env[68617]: DEBUG nova.compute.manager [None req-15c8b2c8-3704-406d-85e8-cb2c2467602c tempest-ServerActionsTestOtherB-1124123640 tempest-ServerActionsTestOtherB-1124123640-project-member] [instance: 13d6e00b-3c18-4346-b229-b56bdaba2dc8] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1384.912477] env[68617]: DEBUG oslo_concurrency.lockutils [None req-15c8b2c8-3704-406d-85e8-cb2c2467602c tempest-ServerActionsTestOtherB-1124123640 tempest-ServerActionsTestOtherB-1124123640-project-member] Lock "13d6e00b-3c18-4346-b229-b56bdaba2dc8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 220.255s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.920469] env[68617]: DEBUG nova.compute.manager [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1384.977539] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1384.977539] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1384.979331] env[68617]: INFO nova.compute.claims [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1385.285642] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48d4dfd3-4e53-4127-a617-b98c1ee30891 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1385.293641] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a838d19-62bf-473e-ae35-47337dc4a8d9 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1385.323097] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28cf105b-7fd4-4869-bf59-6c90cb250df7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1385.330163] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1da56d9a-3421-4855-b2b7-21a1f19ef2e7 {{(pid=68617) request_handler
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1385.342884] env[68617]: DEBUG nova.compute.provider_tree [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1385.352780] env[68617]: DEBUG nova.scheduler.client.report [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1385.365945] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.388s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1385.366420] env[68617]: DEBUG nova.compute.manager [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1385.398675] env[68617]: DEBUG nova.compute.utils [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1385.400038] env[68617]: DEBUG nova.compute.manager [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1385.400213] env[68617]: DEBUG nova.network.neutron [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1385.408852] env[68617]: DEBUG nova.compute.manager [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Start building block device mappings for instance. 
{{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1385.478035] env[68617]: DEBUG nova.compute.manager [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Start spawning the instance on the hypervisor. {{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1385.503165] env[68617]: DEBUG nova.virt.hardware [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=<?>,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-17T05:31:26Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1385.503427] env[68617]: DEBUG nova.virt.hardware [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1385.503582] env[68617]: DEBUG nova.virt.hardware [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1385.503759] env[68617]: DEBUG nova.virt.hardware [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1385.503995] env[68617]: DEBUG nova.virt.hardware [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1385.504179] env[68617]: DEBUG nova.virt.hardware [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1385.504396] env[68617]: DEBUG nova.virt.hardware [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617)
_get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1385.504557] env[68617]: DEBUG nova.virt.hardware [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1385.504724] env[68617]: DEBUG nova.virt.hardware [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1385.504886] env[68617]: DEBUG nova.virt.hardware [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1385.505070] env[68617]: DEBUG nova.virt.hardware [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1385.505923] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e2842c8-997e-4ca0-8e1a-14c116d6e249 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1385.514401] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46d6875d-508e-4f6f-bfa7-6f4964965f84 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1385.634336] env[68617]: DEBUG nova.policy [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3e3e6fa7da72463faa4f9568ff97776a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c07119c006e84a66bf7a37c1920f3694', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 1385.942873] env[68617]: DEBUG nova.network.neutron [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Successfully created port: 1dba5bab-11dc-47bf-958d-716cbf168ed8 {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1386.489825] env[68617]: DEBUG nova.compute.manager [req-6a7085c6-866c-4a56-b8f6-7f38e5accb24 req-cd983b7d-bd1a-4d7b-9514-049777403789 service nova] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Received event network-vif-plugged-1dba5bab-11dc-47bf-958d-716cbf168ed8 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1386.490214] env[68617]: DEBUG 
oslo_concurrency.lockutils [req-6a7085c6-866c-4a56-b8f6-7f38e5accb24 req-cd983b7d-bd1a-4d7b-9514-049777403789 service nova] Acquiring lock "f03b9bc5-9438-4c0c-b595-72c631bece08-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1386.490261] env[68617]: DEBUG oslo_concurrency.lockutils [req-6a7085c6-866c-4a56-b8f6-7f38e5accb24 req-cd983b7d-bd1a-4d7b-9514-049777403789 service nova] Lock "f03b9bc5-9438-4c0c-b595-72c631bece08-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1386.490425] env[68617]: DEBUG oslo_concurrency.lockutils [req-6a7085c6-866c-4a56-b8f6-7f38e5accb24 req-cd983b7d-bd1a-4d7b-9514-049777403789 service nova] Lock "f03b9bc5-9438-4c0c-b595-72c631bece08-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1386.490586] env[68617]: DEBUG nova.compute.manager [req-6a7085c6-866c-4a56-b8f6-7f38e5accb24 req-cd983b7d-bd1a-4d7b-9514-049777403789 service nova] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] No waiting events found dispatching network-vif-plugged-1dba5bab-11dc-47bf-958d-716cbf168ed8 {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1386.490936] env[68617]: WARNING nova.compute.manager [req-6a7085c6-866c-4a56-b8f6-7f38e5accb24 req-cd983b7d-bd1a-4d7b-9514-049777403789 service nova] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Received unexpected event network-vif-plugged-1dba5bab-11dc-47bf-958d-716cbf168ed8 for instance with vm_state building and task_state spawning. 
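The acquire/pop/release sequence above is benign: Neutron delivered network-vif-plugged before the spawn path registered a waiter for it, so the handler pops nothing and logs the event as unexpected while the instance is still building. A minimal, self-contained sketch of that waiter bookkeeping follows; it is an illustration with made-up names, not Nova's actual InstanceEvents implementation.

    # Toy model (illustrative only) of per-instance event waiters guarded by
    # the "<uuid>-events" lock seen in the log lines above.
    import threading

    class InstanceEvents:
        def __init__(self):
            self._lock = threading.Lock()
            self._events = {}  # instance_uuid -> {event_name: threading.Event}

        def prepare(self, instance_uuid, event_name):
            # The spawning thread would register interest *before* the VIF plug.
            with self._lock:
                ev = threading.Event()
                self._events.setdefault(instance_uuid, {})[event_name] = ev
                return ev

        def pop(self, instance_uuid, event_name):
            # The external-event handler pops the waiter under the same lock,
            # mirroring the acquired/released pair logged above.
            with self._lock:
                return self._events.get(instance_uuid, {}).pop(event_name, None)

    events = InstanceEvents()
    # No prepare() happened yet, so pop() returns None -- the analogue of
    # "No waiting events found" followed by the "Received unexpected event"
    # warning for an instance in vm_state building.
    if events.pop('f03b9bc5-9438-4c0c-b595-72c631bece08',
                  'network-vif-plugged-1dba5bab-11dc-47bf-958d-716cbf168ed8') is None:
        print('unexpected event: no waiter registered yet')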
[ 1386.562352] env[68617]: DEBUG nova.network.neutron [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Successfully updated port: 1dba5bab-11dc-47bf-958d-716cbf168ed8 {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1386.577585] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Acquiring lock "refresh_cache-f03b9bc5-9438-4c0c-b595-72c631bece08" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1386.577727] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Acquired lock "refresh_cache-f03b9bc5-9438-4c0c-b595-72c631bece08" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1386.577877] env[68617]: DEBUG nova.network.neutron [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1386.619056] env[68617]: DEBUG nova.network.neutron [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Instance cache missing network info. 
{{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1386.791977] env[68617]: DEBUG nova.network.neutron [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Updating instance_info_cache with network_info: [{"id": "1dba5bab-11dc-47bf-958d-716cbf168ed8", "address": "fa:16:3e:6a:6e:c1", "network": {"id": "65bec07e-2fec-40ce-ac24-d75d61493fed", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-174756301-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c07119c006e84a66bf7a37c1920f3694", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "79ece966-6187-47d7-bce7-cc39df14ac67", "external-id": "nsx-vlan-transportzone-472", "segmentation_id": 472, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1dba5bab-11", "ovs_interfaceid": "1dba5bab-11dc-47bf-958d-716cbf168ed8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1386.803744] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Releasing lock "refresh_cache-f03b9bc5-9438-4c0c-b595-72c631bece08" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1386.803744] env[68617]: DEBUG nova.compute.manager [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Instance network_info: |[{"id": "1dba5bab-11dc-47bf-958d-716cbf168ed8", "address": "fa:16:3e:6a:6e:c1", "network": {"id": "65bec07e-2fec-40ce-ac24-d75d61493fed", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-174756301-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c07119c006e84a66bf7a37c1920f3694", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "79ece966-6187-47d7-bce7-cc39df14ac67", "external-id": "nsx-vlan-transportzone-472", "segmentation_id": 472, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1dba5bab-11", "ovs_interfaceid": "1dba5bab-11dc-47bf-958d-716cbf168ed8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1386.804435] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6a:6e:c1', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '79ece966-6187-47d7-bce7-cc39df14ac67', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1dba5bab-11dc-47bf-958d-716cbf168ed8', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1386.812055] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Creating folder: Project (c07119c006e84a66bf7a37c1920f3694). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1386.812531] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9e5631c4-e989-4a41-9d18-382b6baa8cba {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1386.822952] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Created folder: Project (c07119c006e84a66bf7a37c1920f3694) in parent group-v693691. [ 1386.823194] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Creating folder: Instances. Parent ref: group-v693764. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1386.823323] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f62221ac-2e93-40b8-acec-96a706c85352 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1386.831177] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Created folder: Instances in parent group-v693764. [ 1386.831400] env[68617]: DEBUG oslo.service.loopingcall [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1386.831571] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1386.831756] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-eb0b9331-48e9-4ffe-afa9-18db9bcdefb8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1386.850016] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1386.850016] env[68617]: value = "task-3470816" [ 1386.850016] env[68617]: _type = "Task" [ 1386.850016] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1386.857247] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470816, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1387.359657] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470816, 'name': CreateVM_Task, 'duration_secs': 0.278308} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1387.359829] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1387.360507] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1387.360671] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1387.361080] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1387.361329] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-12d55d16-48dc-461a-99fa-9c5b3c1a28b1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1387.366053] env[68617]: DEBUG oslo_vmware.api [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Waiting for the task: (returnval){ [ 1387.366053] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]5214860a-f21b-fb8a-d173-9d30b4f72ba8" [ 1387.366053] env[68617]: _type = "Task" [ 1387.366053] env[68617]: } to 
complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1387.373307] env[68617]: DEBUG oslo_vmware.api [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]5214860a-f21b-fb8a-d173-9d30b4f72ba8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1387.878443] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1387.878819] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1387.879148] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1388.512218] env[68617]: DEBUG nova.compute.manager [req-6b814e27-2974-4cd8-b044-451c66a0bbe5 req-782f48da-3dd7-4eff-b502-7ea0c76f9f75 service nova] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Received event network-changed-1dba5bab-11dc-47bf-958d-716cbf168ed8 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1388.512412] env[68617]: DEBUG nova.compute.manager [req-6b814e27-2974-4cd8-b044-451c66a0bbe5 req-782f48da-3dd7-4eff-b502-7ea0c76f9f75 service nova] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Refreshing instance network info cache due to event network-changed-1dba5bab-11dc-47bf-958d-716cbf168ed8. 
{{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1388.512633] env[68617]: DEBUG oslo_concurrency.lockutils [req-6b814e27-2974-4cd8-b044-451c66a0bbe5 req-782f48da-3dd7-4eff-b502-7ea0c76f9f75 service nova] Acquiring lock "refresh_cache-f03b9bc5-9438-4c0c-b595-72c631bece08" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1388.512775] env[68617]: DEBUG oslo_concurrency.lockutils [req-6b814e27-2974-4cd8-b044-451c66a0bbe5 req-782f48da-3dd7-4eff-b502-7ea0c76f9f75 service nova] Acquired lock "refresh_cache-f03b9bc5-9438-4c0c-b595-72c631bece08" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1388.512981] env[68617]: DEBUG nova.network.neutron [req-6b814e27-2974-4cd8-b044-451c66a0bbe5 req-782f48da-3dd7-4eff-b502-7ea0c76f9f75 service nova] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Refreshing network info cache for port 1dba5bab-11dc-47bf-958d-716cbf168ed8 {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1388.762348] env[68617]: DEBUG nova.network.neutron [req-6b814e27-2974-4cd8-b044-451c66a0bbe5 req-782f48da-3dd7-4eff-b502-7ea0c76f9f75 service nova] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Updated VIF entry in instance network info cache for port 1dba5bab-11dc-47bf-958d-716cbf168ed8. {{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1388.762715] env[68617]: DEBUG nova.network.neutron [req-6b814e27-2974-4cd8-b044-451c66a0bbe5 req-782f48da-3dd7-4eff-b502-7ea0c76f9f75 service nova] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Updating instance_info_cache with network_info: [{"id": "1dba5bab-11dc-47bf-958d-716cbf168ed8", "address": "fa:16:3e:6a:6e:c1", "network": {"id": "65bec07e-2fec-40ce-ac24-d75d61493fed", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-174756301-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c07119c006e84a66bf7a37c1920f3694", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "79ece966-6187-47d7-bce7-cc39df14ac67", "external-id": "nsx-vlan-transportzone-472", "segmentation_id": 472, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1dba5bab-11", "ovs_interfaceid": "1dba5bab-11dc-47bf-958d-716cbf168ed8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1388.775587] env[68617]: DEBUG oslo_concurrency.lockutils [req-6b814e27-2974-4cd8-b044-451c66a0bbe5 req-782f48da-3dd7-4eff-b502-7ea0c76f9f75 service nova] Releasing lock "refresh_cache-f03b9bc5-9438-4c0c-b595-72c631bece08" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1392.272189] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0f4f2eab-6e03-451a-b883-309c60868205 tempest-AttachVolumeShelveTestJSON-1744895665 
tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Acquiring lock "f03b9bc5-9438-4c0c-b595-72c631bece08" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1426.699548] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1426.699873] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1426.699873] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1426.723895] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1426.724066] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1426.724201] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1426.724328] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1426.724453] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1426.724574] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1426.724693] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1426.724812] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Skipping network cache update for instance because it is Building. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1426.724949] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1426.725120] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1426.725250] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1429.699120] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1429.699396] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1430.699249] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1432.374976] env[68617]: WARNING oslo_vmware.rw_handles [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1432.374976] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1432.374976] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1432.374976] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1432.374976] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1432.374976] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 1432.374976] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1432.374976] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1432.374976] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1432.374976] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1432.374976] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1432.374976] env[68617]: ERROR oslo_vmware.rw_handles [ 1432.374976] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 
tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/df9882af-be91-4544-a0d9-72d64e57d0c6/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1432.377225] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1432.377486] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Copying Virtual Disk [datastore2] vmware_temp/df9882af-be91-4544-a0d9-72d64e57d0c6/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/df9882af-be91-4544-a0d9-72d64e57d0c6/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1432.377774] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-727057b6-a71b-403a-a857-5065e13622c6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1432.386145] env[68617]: DEBUG oslo_vmware.api [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Waiting for the task: (returnval){ [ 1432.386145] env[68617]: value = "task-3470817" [ 1432.386145] env[68617]: _type = "Task" [ 1432.386145] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1432.394132] env[68617]: DEBUG oslo_vmware.api [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Task: {'id': task-3470817, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1432.699087] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1432.896106] env[68617]: DEBUG oslo_vmware.exceptions [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Fault InvalidArgument not matched. 
{{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1432.896400] env[68617]: DEBUG oslo_concurrency.lockutils [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1432.896952] env[68617]: ERROR nova.compute.manager [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1432.896952] env[68617]: Faults: ['InvalidArgument'] [ 1432.896952] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Traceback (most recent call last): [ 1432.896952] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1432.896952] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] yield resources [ 1432.896952] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1432.896952] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] self.driver.spawn(context, instance, image_meta, [ 1432.896952] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1432.896952] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1432.896952] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1432.896952] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] self._fetch_image_if_missing(context, vi) [ 1432.896952] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1432.897386] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] image_cache(vi, tmp_image_ds_loc) [ 1432.897386] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1432.897386] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] vm_util.copy_virtual_disk( [ 1432.897386] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1432.897386] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] session._wait_for_task(vmdk_copy_task) [ 1432.897386] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1432.897386] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] return self.wait_for_task(task_ref) [ 1432.897386] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1432.897386] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] return evt.wait() [ 1432.897386] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1432.897386] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] result = hub.switch() [ 1432.897386] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1432.897386] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] return self.greenlet.switch() [ 1432.897690] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1432.897690] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] self.f(*self.args, **self.kw) [ 1432.897690] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1432.897690] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] raise exceptions.translate_fault(task_info.error) [ 1432.897690] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1432.897690] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Faults: ['InvalidArgument'] [ 1432.897690] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] [ 1432.897690] env[68617]: INFO nova.compute.manager [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Terminating instance [ 1432.898773] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1432.898985] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1432.899306] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d641bee1-fc0b-457f-8b98-5202ddd98e99 {{(pid=68617) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1432.901666] env[68617]: DEBUG nova.compute.manager [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1432.901851] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1432.902582] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7be1f5c-62e0-4674-bfbd-c2cc90075396 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1432.909037] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1432.909244] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1de98421-3cd7-40bc-9830-12d520297f9b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1432.911353] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1432.911529] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1432.912453] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9fd1876d-4205-4704-b33c-c1c5b2c3bf11 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1432.917226] env[68617]: DEBUG oslo_vmware.api [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Waiting for the task: (returnval){ [ 1432.917226] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52e88276-31e9-9519-5e6a-517895b835ee" [ 1432.917226] env[68617]: _type = "Task" [ 1432.917226] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1432.924038] env[68617]: DEBUG oslo_vmware.api [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52e88276-31e9-9519-5e6a-517895b835ee, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1432.977098] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1432.977381] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1432.977579] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Deleting the datastore file [datastore2] 995585f5-57a4-4ba6-9e28-18a086af264c {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1432.977868] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-99197e4b-b8a3-4ba0-ad1e-da7c2f7c0259 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1432.984667] env[68617]: DEBUG oslo_vmware.api [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Waiting for the task: (returnval){ [ 1432.984667] env[68617]: value = "task-3470819" [ 1432.984667] env[68617]: _type = "Task" [ 1432.984667] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1432.992661] env[68617]: DEBUG oslo_vmware.api [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Task: {'id': task-3470819, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1433.427995] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1433.430236] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Creating directory with path [datastore2] vmware_temp/ec4daf5e-765d-4390-af8f-e3724a1b7b49/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1433.430236] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f8bc955a-2a6b-4017-9b08-5fed4cb6c1ea {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.441601] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Created directory with path [datastore2] vmware_temp/ec4daf5e-765d-4390-af8f-e3724a1b7b49/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1433.442022] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Fetch image to [datastore2] vmware_temp/ec4daf5e-765d-4390-af8f-e3724a1b7b49/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1433.442022] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/ec4daf5e-765d-4390-af8f-e3724a1b7b49/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1433.442796] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17a6c98e-696b-4719-863d-1d93473dcb57 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.449784] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7bdca79-4dac-4316-a262-d032471de0c3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.458905] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b78e2eb-4bfc-463f-8954-f10f5a1db96e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.493824] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db8da445-56d4-4f0d-a2c8-1dc3ff9b05d1 {{(pid=68617) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.500855] env[68617]: DEBUG oslo_vmware.api [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Task: {'id': task-3470819, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075589} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1433.502477] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1433.502579] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1433.502757] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1433.502931] env[68617]: INFO nova.compute.manager [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Took 0.60 seconds to destroy the instance on the hypervisor. 
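The UnregisterVM, DeleteDatastoreFile_Task, "progress is 0%." and "completed successfully" lines above all flow through the same poll loop behind oslo.vmware's wait_for_task. The sketch below is a rough, self-contained approximation of that loop; the FakeTask stub and tuple layout are assumptions for illustration, not oslo_vmware's real API (which lives in oslo_vmware/api.py and translates vCenter faults into exceptions such as the VimFaultException seen earlier).

    import time

    class FakeTask:
        # Stand-in for a vCenter task whose info is polled (illustration only).
        def __init__(self, states):
            self._states = iter(states)
        def info(self):
            return next(self._states)

    def wait_for_task(task, interval=0.5):
        # Poll task state until 'success' or 'error', matching the repeated
        # "_poll_task ... progress is N%" entries in the log.
        while True:
            state, progress, error = task.info()
            if state == 'success':
                return
            if state == 'error':
                # The real library raises a translated fault exception here.
                raise RuntimeError(error)
            time.sleep(interval)

    wait_for_task(FakeTask([('running', 0, None), ('success', 100, None)]),
                  interval=0)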
[ 1433.505101] env[68617]: DEBUG nova.compute.claims [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1433.505328] env[68617]: DEBUG oslo_concurrency.lockutils [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1433.505557] env[68617]: DEBUG oslo_concurrency.lockutils [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1433.508135] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4c80a700-b273-445c-b9b8-3f2236f5ce0b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.534703] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1433.698551] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1433.698727] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1433.743322] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1433.744941] env[68617]: ERROR nova.compute.manager [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image c87eab51-bc9a-44dc-8f0d-7ab73283e453. 
[ 1433.744941] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Traceback (most recent call last): [ 1433.744941] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1433.744941] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1433.744941] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1433.744941] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] result = getattr(controller, method)(*args, **kwargs) [ 1433.744941] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1433.744941] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] return self._get(image_id) [ 1433.744941] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1433.744941] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1433.744941] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1433.745307] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] resp, body = self.http_client.get(url, headers=header) [ 1433.745307] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1433.745307] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] return self.request(url, 'GET', **kwargs) [ 1433.745307] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1433.745307] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] return self._handle_response(resp) [ 1433.745307] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1433.745307] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] raise exc.from_response(resp, resp.content) [ 1433.745307] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1433.745307] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] [ 1433.745307] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] During handling of the above exception, another exception occurred: [ 1433.745307] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] [ 1433.745307] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Traceback (most recent call last): [ 1433.745685] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1433.745685] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] yield resources [ 1433.745685] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1433.745685] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] self.driver.spawn(context, instance, image_meta, [ 1433.745685] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1433.745685] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1433.745685] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1433.745685] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] self._fetch_image_if_missing(context, vi) [ 1433.745685] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1433.745685] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] image_fetch(context, vi, tmp_image_ds_loc) [ 1433.745685] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1433.745685] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] images.fetch_image( [ 1433.745685] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1433.746072] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] metadata = IMAGE_API.get(context, image_ref) [ 1433.746072] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1433.746072] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] return session.show(context, image_id, [ 1433.746072] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1433.746072] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] _reraise_translated_image_exception(image_id) [ 1433.746072] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1433.746072] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] raise new_exc.with_traceback(exc_trace) [ 1433.746072] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1433.746072] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1433.746072] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1433.746072] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] result = getattr(controller, method)(*args, **kwargs) [ 1433.746072] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1433.746072] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] return self._get(image_id) [ 1433.746423] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1433.746423] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1433.746423] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1433.746423] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] resp, body = self.http_client.get(url, headers=header) [ 1433.746423] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1433.746423] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] return self.request(url, 'GET', **kwargs) [ 1433.746423] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1433.746423] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] return self._handle_response(resp) [ 1433.746423] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1433.746423] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] raise exc.from_response(resp, resp.content) [ 1433.746423] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] nova.exception.ImageNotAuthorized: Not authorized for image c87eab51-bc9a-44dc-8f0d-7ab73283e453. 
[ 1433.746423] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] [ 1433.746745] env[68617]: INFO nova.compute.manager [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Terminating instance [ 1433.746865] env[68617]: DEBUG oslo_concurrency.lockutils [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1433.747091] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1433.747619] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Acquiring lock "refresh_cache-82864ac3-a199-478c-8c57-97ea0a256201" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1433.747768] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Acquired lock "refresh_cache-82864ac3-a199-478c-8c57-97ea0a256201" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1433.747930] env[68617]: DEBUG nova.network.neutron [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1433.748837] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3812945f-4535-43b0-aca7-8527a809aace {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.761044] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1433.761044] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Folder [datastore2] devstack-image-cache_base created. 
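Records like "Creating directory with path [datastore2] devstack-image-cache_base" use vSphere's bracketed datastore-path notation ("[datastore] relative/path"). A small illustrative helper for building such strings; this is an assumption-level simplification of nova's DatastorePath in nova/virt/vmwareapi/ds_util.py, not its actual code:

    def ds_path(datastore, *parts):
        # Build a "[datastore] dir/file" style vSphere datastore path.
        rel = "/".join(p.strip("/") for p in parts if p)
        return "[%s] %s" % (datastore, rel) if rel else "[%s]" % datastore

    assert ds_path("datastore2", "devstack-image-cache_base") == \
        "[datastore2] devstack-image-cache_base"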
{{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1433.761873] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e34e6d34-c9cf-4f2b-a8a7-dad12fd8d160 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.769738] env[68617]: DEBUG oslo_vmware.api [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Waiting for the task: (returnval){ [ 1433.769738] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52bb4a0b-ad2f-181d-6ffd-7fc74780946b" [ 1433.769738] env[68617]: _type = "Task" [ 1433.769738] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1433.778146] env[68617]: DEBUG oslo_vmware.api [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52bb4a0b-ad2f-181d-6ffd-7fc74780946b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1433.779039] env[68617]: DEBUG nova.network.neutron [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1433.845261] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b329620-8709-4a63-87b0-31a6c58d6c3b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.853508] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f42c476-59e9-4113-b6af-21a16f1ed368 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.884388] env[68617]: DEBUG nova.network.neutron [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1433.886237] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b02776d-6869-411e-b88b-737d9a0ea7ea {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.894744] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a664cfd-24fe-436b-b959-a5c6b0724e58 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.901024] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Releasing lock "refresh_cache-82864ac3-a199-478c-8c57-97ea0a256201" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 
1433.901485] env[68617]: DEBUG nova.compute.manager [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1433.901696] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1433.903230] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f878415-0c26-47d2-a51c-8050cdcd6866 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.914991] env[68617]: DEBUG nova.compute.provider_tree [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1433.921952] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1433.921952] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-15bf72be-ac01-4fb6-9bcd-0a0d088e5f0c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.929203] env[68617]: DEBUG nova.scheduler.client.report [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1433.948033] env[68617]: DEBUG oslo_concurrency.lockutils [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.442s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1433.948627] env[68617]: ERROR nova.compute.manager [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1433.948627] 
env[68617]: Faults: ['InvalidArgument'] [ 1433.948627] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Traceback (most recent call last): [ 1433.948627] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1433.948627] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] self.driver.spawn(context, instance, image_meta, [ 1433.948627] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1433.948627] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1433.948627] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1433.948627] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] self._fetch_image_if_missing(context, vi) [ 1433.948627] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1433.948627] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] image_cache(vi, tmp_image_ds_loc) [ 1433.948627] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1433.949106] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] vm_util.copy_virtual_disk( [ 1433.949106] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1433.949106] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] session._wait_for_task(vmdk_copy_task) [ 1433.949106] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1433.949106] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] return self.wait_for_task(task_ref) [ 1433.949106] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1433.949106] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] return evt.wait() [ 1433.949106] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1433.949106] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] result = hub.switch() [ 1433.949106] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1433.949106] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] return self.greenlet.switch() [ 1433.949106] env[68617]: ERROR nova.compute.manager [instance: 
995585f5-57a4-4ba6-9e28-18a086af264c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1433.949106] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] self.f(*self.args, **self.kw) [ 1433.949698] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1433.949698] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] raise exceptions.translate_fault(task_info.error) [ 1433.949698] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1433.949698] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Faults: ['InvalidArgument'] [ 1433.949698] env[68617]: ERROR nova.compute.manager [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] [ 1433.949698] env[68617]: DEBUG nova.compute.utils [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1433.951862] env[68617]: DEBUG nova.compute.manager [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Build of instance 995585f5-57a4-4ba6-9e28-18a086af264c was re-scheduled: A specified parameter was not correct: fileType [ 1433.951862] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1433.952304] env[68617]: DEBUG nova.compute.manager [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1433.952470] env[68617]: DEBUG nova.compute.manager [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
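The fileType traceback above runs through oslo_vmware.api.wait_for_task, which polls the vCenter task until it finishes and converts a server-side fault (here InvalidArgument on the CopyVirtualDisk parameters) into a client-side exception via exceptions.translate_fault. A rough synchronous sketch of that poll loop; the real implementation drives it with an eventlet-based looping call rather than time.sleep, and poll_task_info here is a hypothetical callable:

    import time

    class VimFaultException(Exception):
        """Stand-in for oslo_vmware.exceptions.VimFaultException."""
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list

    def wait_for_task(poll_task_info, interval=0.5):
        # poll_task_info() returns a dict like
        # {"state": "running"|"success"|"error", ...}.
        while True:
            info = poll_task_info()
            if info["state"] == "success":
                return info.get("result")
            if info["state"] == "error":
                # Mirrors "raise exceptions.translate_fault(task_info.error)"
                raise VimFaultException(info["error"]["faults"],
                                        info["error"]["message"])
            time.sleep(interval)  # still queued/running; poll again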
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1433.952638] env[68617]: DEBUG nova.compute.manager [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1433.952800] env[68617]: DEBUG nova.network.neutron [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1433.957820] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1433.958046] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1433.958232] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Deleting the datastore file [datastore2] 82864ac3-a199-478c-8c57-97ea0a256201 {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1433.958506] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e80f07c8-57ec-448e-86b4-2a4b2fdfe878 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.966036] env[68617]: DEBUG oslo_vmware.api [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Waiting for the task: (returnval){ [ 1433.966036] env[68617]: value = "task-3470821" [ 1433.966036] env[68617]: _type = "Task" [ 1433.966036] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1433.973781] env[68617]: DEBUG oslo_vmware.api [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Task: {'id': task-3470821, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1434.286580] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1434.287055] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Creating directory with path [datastore2] vmware_temp/48ace648-4811-4e5e-8fc1-ab74e11dd2c7/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1434.287481] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8bd07a73-efdf-4e7a-a4e8-e48ce7dd5bf3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1434.300019] env[68617]: DEBUG nova.network.neutron [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1434.302584] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Created directory with path [datastore2] vmware_temp/48ace648-4811-4e5e-8fc1-ab74e11dd2c7/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1434.302787] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Fetch image to [datastore2] vmware_temp/48ace648-4811-4e5e-8fc1-ab74e11dd2c7/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1434.302958] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/48ace648-4811-4e5e-8fc1-ab74e11dd2c7/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1434.303977] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78d0e0fe-4260-4567-8e50-5b547de29a0e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1434.312146] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d129df9-5b03-4061-aae1-d71f5d4d7e85 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1434.316211] env[68617]: INFO nova.compute.manager [None 
req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Took 0.36 seconds to deallocate network for instance. [ 1434.326861] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-778ffa40-0bff-4112-b2b6-24640091606d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1434.362616] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-918befc9-3bb7-42a8-ac3a-086b408b00ed {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1434.372758] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c196fa5f-48f3-4ddd-a220-ad6f72f991d0 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1435.182373] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1435.194869] env[68617]: DEBUG oslo_vmware.api [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Task: {'id': task-3470821, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.041048} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1435.195133] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1435.195315] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1435.195482] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1435.195644] env[68617]: INFO nova.compute.manager [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Took 1.29 seconds to destroy the instance on the hypervisor. [ 1435.195876] env[68617]: DEBUG oslo.service.loopingcall [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
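The "Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return" records are emitted by oslo.service's looping-call helper, which re-invokes a function on a fixed interval until it signals completion. A minimal example using the real oslo.service API; the retried function here is a stand-in:

    from oslo_service import loopingcall

    attempts = {"n": 0}

    def _deallocate_with_retries():
        attempts["n"] += 1
        if attempts["n"] < 3:
            return  # not done yet; called again on the next interval
        # Raising LoopingCallDone stops the loop and hands back a value.
        raise loopingcall.LoopingCallDone(retvalue=True)

    timer = loopingcall.FixedIntervalLoopingCall(_deallocate_with_retries)
    result = timer.start(interval=0.1).wait()  # True once the loop is done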
{{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1435.197077] env[68617]: DEBUG nova.compute.manager [-] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Skipping network deallocation for instance since networking was not requested. {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1435.198251] env[68617]: DEBUG nova.compute.claims [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1435.198421] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1435.198633] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1435.217164] env[68617]: INFO nova.scheduler.client.report [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Deleted allocations for instance 995585f5-57a4-4ba6-9e28-18a086af264c [ 1435.244204] env[68617]: DEBUG oslo_vmware.rw_handles [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/48ace648-4811-4e5e-8fc1-ab74e11dd2c7/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
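The rw_handles record above opens an HTTPS write handle to an ESX "folder" URL and streams 21318656 bytes of image data into the datastore file. A hedged sketch of the equivalent transfer using requests; the real code is oslo_vmware.rw_handles.FileWriteHandle, and the session-cookie handling, chunking, and certificate validation are all simplified here:

    import requests

    def upload_to_datastore(esx_host, rel_path, dc_path, ds_name,
                            data_iter, size, vim_cookie):
        # e.g. rel_path="vmware_temp/.../tmp-sparse.vmdk", matching the
        # .../folder/...?dcPath=ha-datacenter&dsName=datastore2 URL above.
        url = "https://%s:443/folder/%s" % (esx_host, rel_path)
        resp = requests.put(
            url,
            params={"dcPath": dc_path, "dsName": ds_name},
            headers={"Content-Length": str(size), "Cookie": vim_cookie},
            data=data_iter,   # an iterator: streamed, not buffered in RAM
            verify=False)     # lab-style TLS handling; don't copy blindly
        resp.raise_for_status()
        return resp.status_code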
{{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1435.245800] env[68617]: DEBUG oslo_concurrency.lockutils [None req-eaa786f3-b325-471d-a962-f0d35c9d4130 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "995585f5-57a4-4ba6-9e28-18a086af264c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 686.932s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1435.300667] env[68617]: DEBUG oslo_concurrency.lockutils [None req-07b182b3-189d-4235-affd-8500e6424c2c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "995585f5-57a4-4ba6-9e28-18a086af264c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 488.463s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1435.300889] env[68617]: DEBUG oslo_concurrency.lockutils [None req-07b182b3-189d-4235-affd-8500e6424c2c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "995585f5-57a4-4ba6-9e28-18a086af264c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1435.301101] env[68617]: DEBUG oslo_concurrency.lockutils [None req-07b182b3-189d-4235-affd-8500e6424c2c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "995585f5-57a4-4ba6-9e28-18a086af264c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1435.301265] env[68617]: DEBUG oslo_concurrency.lockutils [None req-07b182b3-189d-4235-affd-8500e6424c2c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "995585f5-57a4-4ba6-9e28-18a086af264c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1435.303565] env[68617]: INFO nova.compute.manager [None req-07b182b3-189d-4235-affd-8500e6424c2c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Terminating instance [ 1435.309426] env[68617]: DEBUG oslo_concurrency.lockutils [None req-07b182b3-189d-4235-affd-8500e6424c2c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "refresh_cache-995585f5-57a4-4ba6-9e28-18a086af264c" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1435.309585] env[68617]: DEBUG oslo_concurrency.lockutils [None req-07b182b3-189d-4235-affd-8500e6424c2c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquired lock "refresh_cache-995585f5-57a4-4ba6-9e28-18a086af264c" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1435.309748] env[68617]: DEBUG nova.network.neutron [None 
req-07b182b3-189d-4235-affd-8500e6424c2c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1435.310836] env[68617]: DEBUG oslo_vmware.rw_handles [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1435.311115] env[68617]: DEBUG oslo_vmware.rw_handles [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/48ace648-4811-4e5e-8fc1-ab74e11dd2c7/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1435.311658] env[68617]: DEBUG nova.compute.manager [None req-a3936b72-dd94-4e40-a012-ddb78915f308 tempest-ServerGroupTestJSON-1648189536 tempest-ServerGroupTestJSON-1648189536-project-member] [instance: 570302ee-2383-4659-80e1-af4b16d03a21] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1435.334945] env[68617]: DEBUG nova.compute.manager [None req-a3936b72-dd94-4e40-a012-ddb78915f308 tempest-ServerGroupTestJSON-1648189536 tempest-ServerGroupTestJSON-1648189536-project-member] [instance: 570302ee-2383-4659-80e1-af4b16d03a21] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1435.345241] env[68617]: DEBUG nova.network.neutron [None req-07b182b3-189d-4235-affd-8500e6424c2c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1435.358606] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a3936b72-dd94-4e40-a012-ddb78915f308 tempest-ServerGroupTestJSON-1648189536 tempest-ServerGroupTestJSON-1648189536-project-member] Lock "570302ee-2383-4659-80e1-af4b16d03a21" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 233.897s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1435.371077] env[68617]: DEBUG nova.compute.manager [None req-7f0054fc-a131-44a2-aa99-cf76b1d25111 tempest-ServerMetadataTestJSON-2102659189 tempest-ServerMetadataTestJSON-2102659189-project-member] [instance: 7bf75617-fcd8-4d96-bf02-ddb723e8ad96] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1435.396887] env[68617]: DEBUG nova.compute.manager [None req-7f0054fc-a131-44a2-aa99-cf76b1d25111 tempest-ServerMetadataTestJSON-2102659189 tempest-ServerMetadataTestJSON-2102659189-project-member] [instance: 7bf75617-fcd8-4d96-bf02-ddb723e8ad96] Instance disappeared before build. 
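The "Instance disappeared before build" records show the short-circuit in _do_build_and_run_instance: if the instance row was deleted while the boot request sat in the queue, the build is skipped instead of failing. A toy sketch of that guard; fetch_instance is a hypothetical callable, not nova's API:

    def do_build_and_run_instance(fetch_instance, uuid):
        inst = fetch_instance(uuid)  # None if deleted while queued
        if inst is None:
            # Matches "Instance disappeared before build": the user already
            # deleted the instance, so there is nothing left to boot.
            return "skipped"
        return "built %s" % uuid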
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1435.422231] env[68617]: DEBUG oslo_concurrency.lockutils [None req-7f0054fc-a131-44a2-aa99-cf76b1d25111 tempest-ServerMetadataTestJSON-2102659189 tempest-ServerMetadataTestJSON-2102659189-project-member] Lock "7bf75617-fcd8-4d96-bf02-ddb723e8ad96" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 228.768s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1435.432106] env[68617]: DEBUG nova.compute.manager [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1435.463839] env[68617]: DEBUG nova.network.neutron [None req-07b182b3-189d-4235-affd-8500e6424c2c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1435.473445] env[68617]: DEBUG oslo_concurrency.lockutils [None req-07b182b3-189d-4235-affd-8500e6424c2c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Releasing lock "refresh_cache-995585f5-57a4-4ba6-9e28-18a086af264c" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1435.473794] env[68617]: DEBUG nova.compute.manager [None req-07b182b3-189d-4235-affd-8500e6424c2c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Start destroying the instance on the hypervisor. 
{{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1435.473980] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-07b182b3-189d-4235-affd-8500e6424c2c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1435.474464] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9d883b88-74d2-48d5-a516-fe0400b26e1d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1435.482679] env[68617]: DEBUG oslo_concurrency.lockutils [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1435.486588] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dae1524f-1486-4eb9-a136-077511da60d3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1435.517020] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-07b182b3-189d-4235-affd-8500e6424c2c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 995585f5-57a4-4ba6-9e28-18a086af264c could not be found. [ 1435.517106] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-07b182b3-189d-4235-affd-8500e6424c2c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1435.517261] env[68617]: INFO nova.compute.manager [None req-07b182b3-189d-4235-affd-8500e6424c2c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1435.517524] env[68617]: DEBUG oslo.service.loopingcall [None req-07b182b3-189d-4235-affd-8500e6424c2c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1435.517744] env[68617]: DEBUG nova.compute.manager [-] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1435.517842] env[68617]: DEBUG nova.network.neutron [-] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1435.535098] env[68617]: DEBUG nova.network.neutron [-] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Instance cache missing network info. 
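A few records above, destroy hits nova.exception.InstanceNotFound ("Instance does not exist on backend") and still reports "Instance destroyed": teardown is deliberately idempotent so deleting a half-built instance succeeds. A hedged sketch of that pattern with stand-in names:

    class InstanceNotFound(Exception):
        """Stand-in for nova.exception.InstanceNotFound."""

    def destroy(lookup_vm_ref, instance_uuid):
        try:
            vm_ref = lookup_vm_ref(instance_uuid)
        except InstanceNotFound:
            # No VM on the backend: treat as already destroyed, as the
            # WARNING + "Instance destroyed" pair above does.
            return None
        # ... UnregisterVM and DeleteDatastoreFile_Task would follow ...
        return vm_ref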
{{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1435.543008] env[68617]: DEBUG nova.network.neutron [-] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1435.551895] env[68617]: INFO nova.compute.manager [-] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] Took 0.03 seconds to deallocate network for instance. [ 1435.586572] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-092c8fe0-c65a-416e-882c-fd69eba81478 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1435.595289] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ccdb77e-be07-4cdf-a291-1f634a625568 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1435.629752] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac8ad599-c61b-442f-ad9a-95c3dc3f5663 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1435.637233] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-136b7144-4fca-46f0-8842-3787b83ba29b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1435.652171] env[68617]: DEBUG nova.compute.provider_tree [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1435.655539] env[68617]: DEBUG oslo_concurrency.lockutils [None req-07b182b3-189d-4235-affd-8500e6424c2c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "995585f5-57a4-4ba6-9e28-18a086af264c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.355s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1435.656305] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "995585f5-57a4-4ba6-9e28-18a086af264c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 146.116s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1435.656487] env[68617]: INFO nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 995585f5-57a4-4ba6-9e28-18a086af264c] During sync_power_state the instance has a pending task (deleting). Skip. 
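The Acquiring/acquired/released lines that bracket nearly every operation in this log, complete with waited/held timings, come from oslo.concurrency's named locks. A minimal example of the same pattern using the real lockutils API, with lock names mirroring the ones in the log:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def abort_instance_claim():
        # Body runs with the "compute_resources" semaphore held; lockutils
        # logs the acquire/release lines and how long the lock was held.
        pass

    def build_nw_info_cache(instance_uuid):
        # The refresh_cache-<uuid> locks above use the context-manager form.
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            pass  # rebuild the instance's network info cache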
[ 1435.656858] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "995585f5-57a4-4ba6-9e28-18a086af264c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1435.660676] env[68617]: DEBUG nova.scheduler.client.report [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1435.672710] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.474s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1435.673451] env[68617]: ERROR nova.compute.manager [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Failed to build and run instance: nova.exception.ImageNotAuthorized: Not authorized for image c87eab51-bc9a-44dc-8f0d-7ab73283e453. 
[ 1435.673451] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Traceback (most recent call last): [ 1435.673451] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1435.673451] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1435.673451] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1435.673451] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] result = getattr(controller, method)(*args, **kwargs) [ 1435.673451] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1435.673451] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] return self._get(image_id) [ 1435.673451] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1435.673451] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1435.673451] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1435.673743] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] resp, body = self.http_client.get(url, headers=header) [ 1435.673743] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1435.673743] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] return self.request(url, 'GET', **kwargs) [ 1435.673743] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1435.673743] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] return self._handle_response(resp) [ 1435.673743] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1435.673743] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] raise exc.from_response(resp, resp.content) [ 1435.673743] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1435.673743] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] [ 1435.673743] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] During handling of the above exception, another exception occurred: [ 1435.673743] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] [ 1435.673743] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Traceback (most recent call last): [ 1435.674305] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1435.674305] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] self.driver.spawn(context, instance, image_meta, [ 1435.674305] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1435.674305] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1435.674305] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1435.674305] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] self._fetch_image_if_missing(context, vi) [ 1435.674305] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1435.674305] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] image_fetch(context, vi, tmp_image_ds_loc) [ 1435.674305] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1435.674305] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] images.fetch_image( [ 1435.674305] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1435.674305] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] metadata = IMAGE_API.get(context, image_ref) [ 1435.674305] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1435.674720] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] return session.show(context, image_id, [ 1435.674720] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1435.674720] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] _reraise_translated_image_exception(image_id) [ 1435.674720] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1435.674720] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] raise new_exc.with_traceback(exc_trace) [ 1435.674720] env[68617]: ERROR nova.compute.manager [instance: 
82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1435.674720] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1435.674720] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1435.674720] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] result = getattr(controller, method)(*args, **kwargs) [ 1435.674720] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1435.674720] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] return self._get(image_id) [ 1435.674720] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1435.674720] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1435.675012] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1435.675012] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] resp, body = self.http_client.get(url, headers=header) [ 1435.675012] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1435.675012] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] return self.request(url, 'GET', **kwargs) [ 1435.675012] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1435.675012] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] return self._handle_response(resp) [ 1435.675012] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1435.675012] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] raise exc.from_response(resp, resp.content) [ 1435.675012] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] nova.exception.ImageNotAuthorized: Not authorized for image c87eab51-bc9a-44dc-8f0d-7ab73283e453. [ 1435.675012] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] [ 1435.675012] env[68617]: DEBUG nova.compute.utils [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Not authorized for image c87eab51-bc9a-44dc-8f0d-7ab73283e453. 
{{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1435.675288] env[68617]: DEBUG oslo_concurrency.lockutils [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.192s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1435.676553] env[68617]: INFO nova.compute.claims [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1435.679187] env[68617]: DEBUG nova.compute.manager [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Build of instance 82864ac3-a199-478c-8c57-97ea0a256201 was re-scheduled: Not authorized for image c87eab51-bc9a-44dc-8f0d-7ab73283e453. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1435.679648] env[68617]: DEBUG nova.compute.manager [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1435.679870] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Acquiring lock "refresh_cache-82864ac3-a199-478c-8c57-97ea0a256201" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1435.680082] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Acquired lock "refresh_cache-82864ac3-a199-478c-8c57-97ea0a256201" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1435.680187] env[68617]: DEBUG nova.network.neutron [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1435.699755] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1435.703518] env[68617]: DEBUG nova.network.neutron [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Instance cache missing network info. 
{{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1435.778105] env[68617]: DEBUG nova.network.neutron [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1435.787232] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Releasing lock "refresh_cache-82864ac3-a199-478c-8c57-97ea0a256201" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1435.787492] env[68617]: DEBUG nova.compute.manager [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1435.787707] env[68617]: DEBUG nova.compute.manager [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Skipping network deallocation for instance since networking was not requested. {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1435.873800] env[68617]: INFO nova.scheduler.client.report [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Deleted allocations for instance 82864ac3-a199-478c-8c57-97ea0a256201 [ 1435.890559] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2eef3b4e-5ff4-45af-999b-5581e4a23b19 tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Lock "82864ac3-a199-478c-8c57-97ea0a256201" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 641.838s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1435.893907] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0fadadf0-5065-4728-8fe2-813e0fbea4cf tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Lock "82864ac3-a199-478c-8c57-97ea0a256201" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 444.987s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1435.894188] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0fadadf0-5065-4728-8fe2-813e0fbea4cf tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Acquiring lock "82864ac3-a199-478c-8c57-97ea0a256201-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1435.894328] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0fadadf0-5065-4728-8fe2-813e0fbea4cf tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Lock "82864ac3-a199-478c-8c57-97ea0a256201-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1435.894517] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0fadadf0-5065-4728-8fe2-813e0fbea4cf tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Lock "82864ac3-a199-478c-8c57-97ea0a256201-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1435.896384] env[68617]: INFO nova.compute.manager [None req-0fadadf0-5065-4728-8fe2-813e0fbea4cf tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Terminating instance [ 1435.898055] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0fadadf0-5065-4728-8fe2-813e0fbea4cf tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Acquiring lock "refresh_cache-82864ac3-a199-478c-8c57-97ea0a256201" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1435.898217] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0fadadf0-5065-4728-8fe2-813e0fbea4cf tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Acquired lock "refresh_cache-82864ac3-a199-478c-8c57-97ea0a256201" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1435.898500] env[68617]: DEBUG nova.network.neutron [None req-0fadadf0-5065-4728-8fe2-813e0fbea4cf tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1435.910905] env[68617]: DEBUG nova.compute.manager [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1435.930569] env[68617]: DEBUG nova.network.neutron [None req-0fadadf0-5065-4728-8fe2-813e0fbea4cf tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Instance cache missing network info. 
{{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1435.967267] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-775c1875-142b-4b4b-8611-ea09ba7a9a10 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1435.970895] env[68617]: DEBUG oslo_concurrency.lockutils [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1435.976491] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11ca147d-a6cf-4be5-baed-26906f45a296 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1436.006223] env[68617]: DEBUG nova.network.neutron [None req-0fadadf0-5065-4728-8fe2-813e0fbea4cf tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1436.007699] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd3e56ce-b6fc-4e26-b93b-b7af8f4e4c82 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1436.014787] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d3ff26d-2c9d-4e6e-a6c8-1d376cce15cf {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1436.020042] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0fadadf0-5065-4728-8fe2-813e0fbea4cf tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Releasing lock "refresh_cache-82864ac3-a199-478c-8c57-97ea0a256201" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1436.020042] env[68617]: DEBUG nova.compute.manager [None req-0fadadf0-5065-4728-8fe2-813e0fbea4cf tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Start destroying the instance on the hypervisor. 
{{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1436.020042] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0fadadf0-5065-4728-8fe2-813e0fbea4cf tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1436.020638] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b6688ae8-b8dd-4736-a9e0-c25473c7cd9c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1436.030082] env[68617]: DEBUG nova.compute.provider_tree [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1436.036953] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9297c908-4448-466f-8eac-b3ffe06d6680 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1436.047716] env[68617]: DEBUG nova.scheduler.client.report [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1436.067765] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-0fadadf0-5065-4728-8fe2-813e0fbea4cf tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 82864ac3-a199-478c-8c57-97ea0a256201 could not be found. [ 1436.067952] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0fadadf0-5065-4728-8fe2-813e0fbea4cf tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1436.068141] env[68617]: INFO nova.compute.manager [None req-0fadadf0-5065-4728-8fe2-813e0fbea4cf tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1436.068374] env[68617]: DEBUG oslo.service.loopingcall [None req-0fadadf0-5065-4728-8fe2-813e0fbea4cf tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1436.068954] env[68617]: DEBUG oslo_concurrency.lockutils [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.394s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1436.069409] env[68617]: DEBUG nova.compute.manager [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1436.071717] env[68617]: DEBUG nova.compute.manager [-] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1436.071819] env[68617]: DEBUG nova.network.neutron [-] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1436.073763] env[68617]: DEBUG oslo_concurrency.lockutils [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.103s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1436.075150] env[68617]: INFO nova.compute.claims [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1436.105583] env[68617]: DEBUG nova.compute.utils [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1436.106411] env[68617]: DEBUG nova.compute.manager [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1436.106411] env[68617]: DEBUG nova.network.neutron [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1436.117248] env[68617]: DEBUG nova.compute.manager [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Start building block device mappings for instance. 
{{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1436.172148] env[68617]: DEBUG nova.policy [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '31ec69b08f16459d87a4c4fa7a528b25', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3501bf23a29d45c1b8b06297c58c4439', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 1436.194807] env[68617]: DEBUG nova.compute.manager [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Start spawning the instance on the hypervisor. {{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1436.256273] env[68617]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68617) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1436.256536] env[68617]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1436.257061] env[68617]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1436.257061] env[68617]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1436.257061] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1436.257061] env[68617]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1436.257061] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1436.257061] env[68617]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1436.257061] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1436.257061] env[68617]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1436.257061] env[68617]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1436.257061] env[68617]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-5f9f8f5f-35fb-4348-999d-bc9c162e5c2b'] [ 1436.257061] env[68617]: ERROR oslo.service.loopingcall [ 1436.257061] env[68617]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1436.257061] env[68617]: ERROR oslo.service.loopingcall [ 1436.257061] env[68617]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1436.257061] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1436.257061] env[68617]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1436.257470] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1436.257470] env[68617]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1436.257470] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1436.257470] env[68617]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1436.257470] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1436.257470] env[68617]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1436.257470] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1436.257470] env[68617]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1436.257470] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1436.257470] env[68617]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1436.257470] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1436.257470] env[68617]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1436.257470] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1436.257470] env[68617]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1436.257470] env[68617]: ERROR 
oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1436.257470] env[68617]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1436.257470] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1436.257470] env[68617]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1436.257942] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1436.257942] env[68617]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1436.257942] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1436.257942] env[68617]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1436.257942] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1436.257942] env[68617]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1436.257942] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1436.257942] env[68617]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1436.257942] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1436.257942] env[68617]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1436.257942] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1436.257942] env[68617]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1436.257942] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1436.257942] env[68617]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1436.257942] env[68617]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1436.257942] env[68617]: ERROR oslo.service.loopingcall [ 1436.258309] env[68617]: ERROR nova.compute.manager [None req-0fadadf0-5065-4728-8fe2-813e0fbea4cf tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
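The two 401s above share one root cause: Keystone no longer accepts the service credentials that the [neutron] section of nova.conf points at, so the admin-token retry path gives up ("Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf"). A minimal sketch of how to exercise those credentials directly with keystoneauth1; every endpoint and account value below is a placeholder, not taken from this log:

    # Hypothetical standalone credential check; it mirrors what the [neutron]
    # auth options in nova.conf configure. All values are placeholders.
    from keystoneauth1 import session
    from keystoneauth1.identity import v3

    auth = v3.Password(
        auth_url='http://controller/identity/v3',  # placeholder Keystone URL
        username='nova',                           # [neutron] username
        password='secret',                         # [neutron] password
        project_name='service',                    # [neutron] project_name
        user_domain_name='Default',
        project_domain_name='Default',
    )
    sess = session.Session(auth=auth)
    # Raises keystoneauth1.exceptions.http.Unauthorized on bad credentials,
    # reproducing the 401 above without involving Nova or Neutron at all.
    print(sess.get_token())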
[ 1436.274240] env[68617]: DEBUG nova.virt.hardware [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=<?>,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-17T05:31:26Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1436.274491] env[68617]: DEBUG nova.virt.hardware [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1436.274652] env[68617]: DEBUG nova.virt.hardware [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1436.274826] env[68617]: DEBUG nova.virt.hardware [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1436.274967] env[68617]: DEBUG nova.virt.hardware [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1436.275199] env[68617]: DEBUG nova.virt.hardware [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1436.275667] env[68617]: DEBUG nova.virt.hardware [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1436.275667] env[68617]: DEBUG nova.virt.hardware [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1436.276182]
env[68617]: DEBUG nova.virt.hardware [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1436.276182] env[68617]: DEBUG nova.virt.hardware [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1436.276182] env[68617]: DEBUG nova.virt.hardware [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1436.276983] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd297cca-fa5d-42f7-9aad-ec1fe599e8e4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1436.287500] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13a42f03-4486-45be-a611-730011c9507d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1436.292974] env[68617]: ERROR nova.compute.manager [None req-0fadadf0-5065-4728-8fe2-813e0fbea4cf tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1436.292974] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Traceback (most recent call last): [ 1436.292974] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1436.292974] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] ret = obj(*args, **kwargs) [ 1436.292974] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1436.292974] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] exception_handler_v20(status_code, error_body) [ 1436.292974] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1436.292974] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] raise client_exc(message=error_message, [ 1436.292974] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1436.292974] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Neutron server returns request_ids: ['req-5f9f8f5f-35fb-4348-999d-bc9c162e5c2b'] [ 1436.292974] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] [ 1436.293318] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] During handling of the above exception, another exception occurred: [ 1436.293318] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] [ 1436.293318] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Traceback (most recent call last): [ 1436.293318] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1436.293318] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] self._delete_instance(context, instance, bdms) [ 1436.293318] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1436.293318] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] self._shutdown_instance(context, instance, bdms) [ 1436.293318] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1436.293318] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] self._try_deallocate_network(context, instance, requested_networks) [ 1436.293318] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1436.293318] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] with excutils.save_and_reraise_exception(): [ 1436.293318] env[68617]: ERROR 
nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1436.293318] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] self.force_reraise() [ 1436.293660] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1436.293660] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] raise self.value [ 1436.293660] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1436.293660] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] _deallocate_network_with_retries() [ 1436.293660] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1436.293660] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] return evt.wait() [ 1436.293660] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1436.293660] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] result = hub.switch() [ 1436.293660] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1436.293660] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] return self.greenlet.switch() [ 1436.293660] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1436.293660] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] result = func(*self.args, **self.kw) [ 1436.293945] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1436.293945] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] result = f(*args, **kwargs) [ 1436.293945] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1436.293945] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] self._deallocate_network( [ 1436.293945] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1436.293945] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] self.network_api.deallocate_for_instance( [ 1436.293945] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1436.293945] env[68617]: ERROR nova.compute.manager [instance: 
82864ac3-a199-478c-8c57-97ea0a256201] data = neutron.list_ports(**search_opts) [ 1436.293945] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1436.293945] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] ret = obj(*args, **kwargs) [ 1436.293945] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1436.293945] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] return self.list('ports', self.ports_path, retrieve_all, [ 1436.293945] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1436.294254] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] ret = obj(*args, **kwargs) [ 1436.294254] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1436.294254] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] for r in self._pagination(collection, path, **params): [ 1436.294254] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1436.294254] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] res = self.get(path, params=params) [ 1436.294254] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1436.294254] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] ret = obj(*args, **kwargs) [ 1436.294254] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1436.294254] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] return self.retry_request("GET", action, body=body, [ 1436.294254] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1436.294254] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] ret = obj(*args, **kwargs) [ 1436.294254] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1436.294254] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] return self.do_request(method, action, body=body, [ 1436.294554] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1436.294554] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] ret = obj(*args, **kwargs) [ 1436.294554] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1436.294554] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] self._handle_fault_response(status_code, replybody, resp) [ 1436.294554] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1436.294554] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1436.294554] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1436.294554] env[68617]: ERROR nova.compute.manager [instance: 82864ac3-a199-478c-8c57-97ea0a256201] [ 1436.326644] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0fadadf0-5065-4728-8fe2-813e0fbea4cf tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Lock "82864ac3-a199-478c-8c57-97ea0a256201" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.433s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1436.329102] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "82864ac3-a199-478c-8c57-97ea0a256201" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 146.787s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1436.329102] env[68617]: INFO nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] During sync_power_state the instance has a pending task (deleting). Skip. [ 1436.329102] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "82864ac3-a199-478c-8c57-97ea0a256201" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1436.394119] env[68617]: INFO nova.compute.manager [None req-0fadadf0-5065-4728-8fe2-813e0fbea4cf tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] [instance: 82864ac3-a199-478c-8c57-97ea0a256201] Successfully reverted task state from None on failure for instance. [ 1436.397741] env[68617]: ERROR oslo_messaging.rpc.server [None req-0fadadf0-5065-4728-8fe2-813e0fbea4cf tempest-ServerShowV257Test-583112352 tempest-ServerShowV257Test-583112352-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1436.397741] env[68617]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1436.397741] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1436.397741] env[68617]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1436.397741] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1436.397741] env[68617]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1436.397741] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1436.397741] env[68617]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1436.397741] env[68617]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1436.397741] env[68617]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-5f9f8f5f-35fb-4348-999d-bc9c162e5c2b'] [ 1436.397741] env[68617]: ERROR oslo_messaging.rpc.server [ 1436.397741] env[68617]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1436.397741] env[68617]: ERROR oslo_messaging.rpc.server [ 1436.397741] env[68617]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1436.397741] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1436.397741] env[68617]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1436.398362] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1436.398362] env[68617]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1436.398362] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1436.398362] env[68617]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1436.398362] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1436.398362] env[68617]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1436.398362] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1436.398362] env[68617]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1436.398362] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1436.398362] env[68617]: ERROR oslo_messaging.rpc.server raise self.value [ 1436.398362] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1436.398362] env[68617]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1436.398362] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1436.398362] env[68617]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 1436.398362] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1436.398362] env[68617]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1436.398362] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1436.398362] env[68617]: ERROR oslo_messaging.rpc.server raise self.value [ 1436.398894] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1436.398894] env[68617]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1436.398894] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 1436.398894] env[68617]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1436.398894] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1436.398894] env[68617]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1436.398894] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1436.398894] env[68617]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1436.398894] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1436.398894] env[68617]: ERROR oslo_messaging.rpc.server raise self.value [ 1436.398894] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1436.398894] env[68617]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1436.398894] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 1436.398894] env[68617]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1436.398894] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1436.398894] env[68617]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1436.398894] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 1436.398894] env[68617]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1436.399288] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1436.399288] env[68617]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1436.399288] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1436.399288] env[68617]: ERROR oslo_messaging.rpc.server raise self.value [ 1436.399288] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1436.399288] env[68617]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1436.399288] env[68617]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1436.399288] env[68617]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1436.399288] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1436.399288] env[68617]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1436.399288] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1436.399288] env[68617]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1436.399288] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1436.399288] env[68617]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1436.399288] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1436.399288] env[68617]: ERROR oslo_messaging.rpc.server raise self.value [ 1436.399288] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1436.399288] env[68617]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1436.399808] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1436.399808] env[68617]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1436.399808] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1436.399808] env[68617]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1436.399808] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1436.399808] env[68617]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1436.399808] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1436.399808] env[68617]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1436.399808] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1436.399808] env[68617]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1436.399808] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1436.399808] env[68617]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1436.399808] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1436.399808] env[68617]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1436.399808] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1436.399808] env[68617]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1436.399808] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1436.399808] env[68617]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1436.400231] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1436.400231] env[68617]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1436.400231] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1436.400231] env[68617]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1436.400231] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1436.400231] env[68617]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1436.400231] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1436.400231] env[68617]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1436.400231] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1436.400231] env[68617]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1436.400231] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1436.400231] env[68617]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1436.400231] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1436.400231] env[68617]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1436.400231] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1436.400231] env[68617]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1436.400231] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1436.400231] env[68617]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1436.400729] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1436.400729] env[68617]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1436.400729] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1436.400729] env[68617]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1436.400729] env[68617]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1436.400729] env[68617]: ERROR oslo_messaging.rpc.server [ 1436.401354] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-437d9703-6f43-4ecc-89e1-5ad82532c6da {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1436.408878] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43895afb-1241-48b5-a532-9be8236c3211 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1436.440257] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9729e1ad-4459-4f18-96af-ce358ce25084 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1436.447868] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9aa95580-1937-4536-9025-e43c849d98cd {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1436.461096] env[68617]: DEBUG nova.compute.provider_tree [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1436.471230] env[68617]: DEBUG nova.scheduler.client.report [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1436.487284] env[68617]: DEBUG oslo_concurrency.lockutils [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.412s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1436.487284] env[68617]: DEBUG nova.compute.manager [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Start building networks asynchronously for instance. 
{{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1436.516794] env[68617]: DEBUG nova.compute.utils [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1436.518208] env[68617]: DEBUG nova.compute.manager [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1436.518381] env[68617]: DEBUG nova.network.neutron [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1436.527313] env[68617]: DEBUG nova.compute.manager [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1436.596702] env[68617]: DEBUG nova.compute.manager [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1436.608734] env[68617]: DEBUG nova.network.neutron [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Successfully created port: 45d6edbe-72f6-4bd0-8566-1e1497a8dd58 {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1436.613438] env[68617]: DEBUG nova.policy [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'be1fb3906fa449949fc0b5eae9cab9fb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1e11c4e5c25a42119594647403c0199b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 1436.624227] env[68617]: DEBUG nova.virt.hardware [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=<?>,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-17T05:31:26Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1436.624517] env[68617]: DEBUG nova.virt.hardware [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1436.624713] env[68617]: DEBUG nova.virt.hardware [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1436.624939] env[68617]: DEBUG nova.virt.hardware [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1436.625082] env[68617]: DEBUG nova.virt.hardware [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [
1436.625249] env[68617]: DEBUG nova.virt.hardware [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1436.625728] env[68617]: DEBUG nova.virt.hardware [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1436.625728] env[68617]: DEBUG nova.virt.hardware [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1436.625834] env[68617]: DEBUG nova.virt.hardware [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1436.625957] env[68617]: DEBUG nova.virt.hardware [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1436.626163] env[68617]: DEBUG nova.virt.hardware [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1436.627569] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8fc4c19d-d437-4321-a5e7-5c91eb564f62 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1436.636334] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48c367fc-625f-4fcf-9bae-8da266729416 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1436.699154] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1436.712098] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1436.712294] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1436.712489] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1436.712609] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1436.713822] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdd6592c-dd58-463b-a074-cf71a451391b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1436.723494] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28c81ded-d237-4ac5-805e-4434466b5ca1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1436.737565] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35feae7f-fa45-4adf-9274-b84d7f887bc5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1436.744054] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7a228a3-d9ce-47d0-8254-74d7c3caca5a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1436.772620] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180920MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1436.772808] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1436.772951] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1436.852801] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1436.852998] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 79c92a1b-20ef-4360-93b4-913cbfcf92fe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1436.853247] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 1cc42c7f-8781-40b0-9f75-edfef3bc90e7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1436.853315] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance d46ca6f3-0ee9-412c-98b4-f639ce4f9228 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1436.853409] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a8ff6232-530c-453a-96e4-f8ce00f976e3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1436.853553] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1436.853684] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e90877a8-47d3-47d7-8362-5bcfe3a98c36 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1436.853805] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f03b9bc5-9438-4c0c-b595-72c631bece08 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1436.853922] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance ee6efd93-25be-4268-afe9-ba39e543a4fb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1436.854048] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 1605028f-5d6d-4ac4-8416-c0465982c53a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1436.866084] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance fc1043b8-535d-4af0-b92b-1f43580cdc9a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1436.879966] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1436.890851] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a019d654-82ed-4ef2-850f-39a1f324566a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1436.904731] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 43495abf-8f99-4f51-81ca-80a43c266695 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1436.916077] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b9d0b85a-f0ac-4f9e-bec4-a82db0eb96c3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1436.930039] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 5d294d66-266f-4a0b-be49-5061fb65b226 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1436.940585] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f8c0a514-7e7f-455a-b84d-9afc2957945c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1436.952653] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 9ca297f6-3239-48d3-9b67-dd1637a3bc25 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1436.961655] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 57cdcf44-576a-4343-9277-4b9ebb2b194a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1436.962216] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1436.962483] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1437.213927] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86b254ce-7884-4f9d-9a5a-4c5257cb255f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1437.221606] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccafafc8-565f-4090-a3e0-75f5dbce685a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1437.259055] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e491925-5fa4-453e-89a0-26899f4e7a21 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1437.267795] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f1eea1a-1688-4361-952e-510e962b265a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1437.283819] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 
5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1437.295308] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1437.303681] env[68617]: DEBUG nova.network.neutron [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Successfully created port: 2333df80-c18a-4373-968f-007363bc1d2d {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1437.307380] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1437.307775] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.535s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1437.706651] env[68617]: DEBUG nova.compute.manager [req-c933cbed-e528-4a94-a47c-ca055fe9f391 req-6596b63d-3f0d-47ef-b131-ab8ebec6a412 service nova] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Received event network-vif-plugged-45d6edbe-72f6-4bd0-8566-1e1497a8dd58 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1437.706891] env[68617]: DEBUG oslo_concurrency.lockutils [req-c933cbed-e528-4a94-a47c-ca055fe9f391 req-6596b63d-3f0d-47ef-b131-ab8ebec6a412 service nova] Acquiring lock "ee6efd93-25be-4268-afe9-ba39e543a4fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1437.707115] env[68617]: DEBUG oslo_concurrency.lockutils [req-c933cbed-e528-4a94-a47c-ca055fe9f391 req-6596b63d-3f0d-47ef-b131-ab8ebec6a412 service nova] Lock "ee6efd93-25be-4268-afe9-ba39e543a4fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1437.707288] env[68617]: DEBUG oslo_concurrency.lockutils [req-c933cbed-e528-4a94-a47c-ca055fe9f391 req-6596b63d-3f0d-47ef-b131-ab8ebec6a412 service nova] Lock "ee6efd93-25be-4268-afe9-ba39e543a4fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1437.707823] env[68617]: DEBUG nova.compute.manager [req-c933cbed-e528-4a94-a47c-ca055fe9f391
req-6596b63d-3f0d-47ef-b131-ab8ebec6a412 service nova] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] No waiting events found dispatching network-vif-plugged-45d6edbe-72f6-4bd0-8566-1e1497a8dd58 {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1437.708049] env[68617]: WARNING nova.compute.manager [req-c933cbed-e528-4a94-a47c-ca055fe9f391 req-6596b63d-3f0d-47ef-b131-ab8ebec6a412 service nova] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Received unexpected event network-vif-plugged-45d6edbe-72f6-4bd0-8566-1e1497a8dd58 for instance with vm_state building and task_state spawning. [ 1437.795546] env[68617]: DEBUG nova.network.neutron [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Successfully updated port: 45d6edbe-72f6-4bd0-8566-1e1497a8dd58 {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1437.809725] env[68617]: DEBUG oslo_concurrency.lockutils [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Acquiring lock "refresh_cache-ee6efd93-25be-4268-afe9-ba39e543a4fb" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1437.813107] env[68617]: DEBUG oslo_concurrency.lockutils [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Acquired lock "refresh_cache-ee6efd93-25be-4268-afe9-ba39e543a4fb" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1437.813107] env[68617]: DEBUG nova.network.neutron [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1437.869543] env[68617]: DEBUG nova.network.neutron [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Instance cache missing network info. 
{{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1438.138212] env[68617]: DEBUG nova.network.neutron [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Updating instance_info_cache with network_info: [{"id": "45d6edbe-72f6-4bd0-8566-1e1497a8dd58", "address": "fa:16:3e:1e:02:6d", "network": {"id": "eebe905c-f291-4af0-8bb1-d003a9f4955c", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-911698373-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3501bf23a29d45c1b8b06297c58c4439", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f0ef5aba-bd9a-42ff-a1a0-5e763986d70a", "external-id": "nsx-vlan-transportzone-209", "segmentation_id": 209, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap45d6edbe-72", "ovs_interfaceid": "45d6edbe-72f6-4bd0-8566-1e1497a8dd58", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1438.150321] env[68617]: DEBUG oslo_concurrency.lockutils [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Releasing lock "refresh_cache-ee6efd93-25be-4268-afe9-ba39e543a4fb" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1438.150722] env[68617]: DEBUG nova.compute.manager [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Instance network_info: |[{"id": "45d6edbe-72f6-4bd0-8566-1e1497a8dd58", "address": "fa:16:3e:1e:02:6d", "network": {"id": "eebe905c-f291-4af0-8bb1-d003a9f4955c", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-911698373-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3501bf23a29d45c1b8b06297c58c4439", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f0ef5aba-bd9a-42ff-a1a0-5e763986d70a", "external-id": "nsx-vlan-transportzone-209", "segmentation_id": 209, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap45d6edbe-72", "ovs_interfaceid": "45d6edbe-72f6-4bd0-8566-1e1497a8dd58", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| 
{{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1438.151031] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:1e:02:6d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f0ef5aba-bd9a-42ff-a1a0-5e763986d70a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '45d6edbe-72f6-4bd0-8566-1e1497a8dd58', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1438.158624] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Creating folder: Project (3501bf23a29d45c1b8b06297c58c4439). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1438.159181] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-03138f25-eb2d-4976-be54-b4c76f835743 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1438.170048] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Created folder: Project (3501bf23a29d45c1b8b06297c58c4439) in parent group-v693691. [ 1438.170228] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Creating folder: Instances. Parent ref: group-v693767. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1438.170462] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-716eead0-dda4-4ed0-8d81-d5dc293ead29 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1438.181883] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Created folder: Instances in parent group-v693767. [ 1438.182136] env[68617]: DEBUG oslo.service.loopingcall [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1438.182407] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1438.182657] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f2c4bfe4-113e-4135-94a0-5ee3dd6ec3ba {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1438.200597] env[68617]: DEBUG nova.network.neutron [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Successfully updated port: 2333df80-c18a-4373-968f-007363bc1d2d {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1438.205222] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1438.205222] env[68617]: value = "task-3470824" [ 1438.205222] env[68617]: _type = "Task" [ 1438.205222] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1438.211616] env[68617]: DEBUG oslo_concurrency.lockutils [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "refresh_cache-1605028f-5d6d-4ac4-8416-c0465982c53a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1438.211616] env[68617]: DEBUG oslo_concurrency.lockutils [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquired lock "refresh_cache-1605028f-5d6d-4ac4-8416-c0465982c53a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1438.211616] env[68617]: DEBUG nova.network.neutron [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1438.214721] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470824, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1438.255699] env[68617]: DEBUG nova.network.neutron [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Instance cache missing network info. 
{{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1438.302947] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1438.430609] env[68617]: DEBUG nova.network.neutron [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Updating instance_info_cache with network_info: [{"id": "2333df80-c18a-4373-968f-007363bc1d2d", "address": "fa:16:3e:d2:61:f6", "network": {"id": "1d9c32bb-1c81-4af6-8d3f-365a52df11cd", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-313904480-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1e11c4e5c25a42119594647403c0199b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d62c1cf-f39a-4626-9552-f1e13c692636", "external-id": "nsx-vlan-transportzone-748", "segmentation_id": 748, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2333df80-c1", "ovs_interfaceid": "2333df80-c18a-4373-968f-007363bc1d2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1438.442268] env[68617]: DEBUG oslo_concurrency.lockutils [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Releasing lock "refresh_cache-1605028f-5d6d-4ac4-8416-c0465982c53a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1438.442579] env[68617]: DEBUG nova.compute.manager [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Instance network_info: |[{"id": "2333df80-c18a-4373-968f-007363bc1d2d", "address": "fa:16:3e:d2:61:f6", "network": {"id": "1d9c32bb-1c81-4af6-8d3f-365a52df11cd", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-313904480-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1e11c4e5c25a42119594647403c0199b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d62c1cf-f39a-4626-9552-f1e13c692636", "external-id": "nsx-vlan-transportzone-748", "segmentation_id": 748, "bound_drivers": {"0": 
"nsxv3"}}, "devname": "tap2333df80-c1", "ovs_interfaceid": "2333df80-c18a-4373-968f-007363bc1d2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1438.442977] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d2:61:f6', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6d62c1cf-f39a-4626-9552-f1e13c692636', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2333df80-c18a-4373-968f-007363bc1d2d', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1438.450341] env[68617]: DEBUG oslo.service.loopingcall [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1438.450852] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1438.451077] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c0253b55-788c-4e00-895f-28f5367098de {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1438.470661] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1438.470661] env[68617]: value = "task-3470825" [ 1438.470661] env[68617]: _type = "Task" [ 1438.470661] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1438.478643] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470825, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1438.716054] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470824, 'name': CreateVM_Task, 'duration_secs': 0.293406} completed successfully. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1438.716205] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1438.716973] env[68617]: DEBUG oslo_concurrency.lockutils [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1438.717230] env[68617]: DEBUG oslo_concurrency.lockutils [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1438.717633] env[68617]: DEBUG oslo_concurrency.lockutils [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1438.717955] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-67600d14-059a-4e24-b803-a2c858de7bc2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1438.723607] env[68617]: DEBUG oslo_vmware.api [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Waiting for the task: (returnval){ [ 1438.723607] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]524a8efb-6072-725f-5adc-1b15efd5cfa1" [ 1438.723607] env[68617]: _type = "Task" [ 1438.723607] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1438.733398] env[68617]: DEBUG oslo_vmware.api [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]524a8efb-6072-725f-5adc-1b15efd5cfa1, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1438.981065] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470825, 'name': CreateVM_Task, 'duration_secs': 0.28776} completed successfully. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1438.981065] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1438.981331] env[68617]: DEBUG oslo_concurrency.lockutils [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1439.234435] env[68617]: DEBUG oslo_concurrency.lockutils [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1439.234806] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1439.234953] env[68617]: DEBUG oslo_concurrency.lockutils [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1439.235189] env[68617]: DEBUG oslo_concurrency.lockutils [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1439.235495] env[68617]: DEBUG oslo_concurrency.lockutils [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1439.235793] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-34839c54-bcb7-46a4-8b1d-fbbec9e42319 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1439.240180] env[68617]: DEBUG oslo_vmware.api [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Waiting for the task: (returnval){ [ 1439.240180] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]5221868b-db89-f965-fc96-ec727389ff49" [ 1439.240180] env[68617]: _type = "Task" [ 1439.240180] env[68617]: } to 
complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1439.248498] env[68617]: DEBUG oslo_vmware.api [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]5221868b-db89-f965-fc96-ec727389ff49, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1439.754211] env[68617]: DEBUG oslo_concurrency.lockutils [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1439.754897] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1439.754897] env[68617]: DEBUG oslo_concurrency.lockutils [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1439.769760] env[68617]: DEBUG nova.compute.manager [req-3ddf28eb-19d1-40df-b054-ba0f21c663e2 req-0b581e93-973b-416e-a18b-27c9d056725f service nova] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Received event network-changed-45d6edbe-72f6-4bd0-8566-1e1497a8dd58 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1439.770044] env[68617]: DEBUG nova.compute.manager [req-3ddf28eb-19d1-40df-b054-ba0f21c663e2 req-0b581e93-973b-416e-a18b-27c9d056725f service nova] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Refreshing instance network info cache due to event network-changed-45d6edbe-72f6-4bd0-8566-1e1497a8dd58. 
{{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1439.770427] env[68617]: DEBUG oslo_concurrency.lockutils [req-3ddf28eb-19d1-40df-b054-ba0f21c663e2 req-0b581e93-973b-416e-a18b-27c9d056725f service nova] Acquiring lock "refresh_cache-ee6efd93-25be-4268-afe9-ba39e543a4fb" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1439.770580] env[68617]: DEBUG oslo_concurrency.lockutils [req-3ddf28eb-19d1-40df-b054-ba0f21c663e2 req-0b581e93-973b-416e-a18b-27c9d056725f service nova] Acquired lock "refresh_cache-ee6efd93-25be-4268-afe9-ba39e543a4fb" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1439.770860] env[68617]: DEBUG nova.network.neutron [req-3ddf28eb-19d1-40df-b054-ba0f21c663e2 req-0b581e93-973b-416e-a18b-27c9d056725f service nova] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Refreshing network info cache for port 45d6edbe-72f6-4bd0-8566-1e1497a8dd58 {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1440.016847] env[68617]: DEBUG nova.network.neutron [req-3ddf28eb-19d1-40df-b054-ba0f21c663e2 req-0b581e93-973b-416e-a18b-27c9d056725f service nova] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Updated VIF entry in instance network info cache for port 45d6edbe-72f6-4bd0-8566-1e1497a8dd58. {{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1440.017226] env[68617]: DEBUG nova.network.neutron [req-3ddf28eb-19d1-40df-b054-ba0f21c663e2 req-0b581e93-973b-416e-a18b-27c9d056725f service nova] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Updating instance_info_cache with network_info: [{"id": "45d6edbe-72f6-4bd0-8566-1e1497a8dd58", "address": "fa:16:3e:1e:02:6d", "network": {"id": "eebe905c-f291-4af0-8bb1-d003a9f4955c", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-911698373-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3501bf23a29d45c1b8b06297c58c4439", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f0ef5aba-bd9a-42ff-a1a0-5e763986d70a", "external-id": "nsx-vlan-transportzone-209", "segmentation_id": 209, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap45d6edbe-72", "ovs_interfaceid": "45d6edbe-72f6-4bd0-8566-1e1497a8dd58", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1440.026294] env[68617]: DEBUG oslo_concurrency.lockutils [req-3ddf28eb-19d1-40df-b054-ba0f21c663e2 req-0b581e93-973b-416e-a18b-27c9d056725f service nova] Releasing lock "refresh_cache-ee6efd93-25be-4268-afe9-ba39e543a4fb" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1440.026525] env[68617]: DEBUG nova.compute.manager [req-3ddf28eb-19d1-40df-b054-ba0f21c663e2 req-0b581e93-973b-416e-a18b-27c9d056725f service nova] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] 
Received event network-vif-plugged-2333df80-c18a-4373-968f-007363bc1d2d {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1440.026748] env[68617]: DEBUG oslo_concurrency.lockutils [req-3ddf28eb-19d1-40df-b054-ba0f21c663e2 req-0b581e93-973b-416e-a18b-27c9d056725f service nova] Acquiring lock "1605028f-5d6d-4ac4-8416-c0465982c53a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1440.026948] env[68617]: DEBUG oslo_concurrency.lockutils [req-3ddf28eb-19d1-40df-b054-ba0f21c663e2 req-0b581e93-973b-416e-a18b-27c9d056725f service nova] Lock "1605028f-5d6d-4ac4-8416-c0465982c53a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1440.027126] env[68617]: DEBUG oslo_concurrency.lockutils [req-3ddf28eb-19d1-40df-b054-ba0f21c663e2 req-0b581e93-973b-416e-a18b-27c9d056725f service nova] Lock "1605028f-5d6d-4ac4-8416-c0465982c53a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1440.027289] env[68617]: DEBUG nova.compute.manager [req-3ddf28eb-19d1-40df-b054-ba0f21c663e2 req-0b581e93-973b-416e-a18b-27c9d056725f service nova] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] No waiting events found dispatching network-vif-plugged-2333df80-c18a-4373-968f-007363bc1d2d {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1440.027448] env[68617]: WARNING nova.compute.manager [req-3ddf28eb-19d1-40df-b054-ba0f21c663e2 req-0b581e93-973b-416e-a18b-27c9d056725f service nova] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Received unexpected event network-vif-plugged-2333df80-c18a-4373-968f-007363bc1d2d for instance with vm_state building and task_state spawning. [ 1440.027606] env[68617]: DEBUG nova.compute.manager [req-3ddf28eb-19d1-40df-b054-ba0f21c663e2 req-0b581e93-973b-416e-a18b-27c9d056725f service nova] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Received event network-changed-2333df80-c18a-4373-968f-007363bc1d2d {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1440.027756] env[68617]: DEBUG nova.compute.manager [req-3ddf28eb-19d1-40df-b054-ba0f21c663e2 req-0b581e93-973b-416e-a18b-27c9d056725f service nova] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Refreshing instance network info cache due to event network-changed-2333df80-c18a-4373-968f-007363bc1d2d. {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
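The req-3ddf28eb entries trace Nova's external-event handling end to end: the Neutron notification first takes the per-instance "<uuid>-events" lock to pop any waiter (warning when nobody is waiting), and the follow-up network-changed event then serializes the cache update under the "refresh_cache-<uuid>" lock seen in the next entries. A minimal sketch of that serialize-then-refresh pattern; refresh_network_cache, get_port_info and the module-level caches are illustrative stand-ins, not Nova's actual API:

    import threading
    from collections import defaultdict

    _cache_locks = defaultdict(threading.Lock)   # one lock per instance UUID
    _nw_info_cache = {}                          # instance UUID -> list of VIF dicts

    def refresh_network_cache(instance_uuid, port_id, get_port_info):
        # Serialize refreshes per instance, mirroring the Acquiring/Acquired/
        # Releasing "refresh_cache-<uuid>" lock lines around this point.
        with _cache_locks[instance_uuid]:
            vifs = _nw_info_cache.setdefault(instance_uuid, [])
            fresh = get_port_info(port_id)       # e.g. a Neutron GET /ports/{id}
            for i, vif in enumerate(vifs):
                if vif["id"] == port_id:
                    vifs[i] = fresh              # the "Updated VIF entry" case
                    break
            else:
                vifs.append(fresh)               # port was not cached yet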
[ 1440.027932] env[68617]: DEBUG oslo_concurrency.lockutils [req-3ddf28eb-19d1-40df-b054-ba0f21c663e2 req-0b581e93-973b-416e-a18b-27c9d056725f service nova] Acquiring lock "refresh_cache-1605028f-5d6d-4ac4-8416-c0465982c53a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1440.028079] env[68617]: DEBUG oslo_concurrency.lockutils [req-3ddf28eb-19d1-40df-b054-ba0f21c663e2 req-0b581e93-973b-416e-a18b-27c9d056725f service nova] Acquired lock "refresh_cache-1605028f-5d6d-4ac4-8416-c0465982c53a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1440.028268] env[68617]: DEBUG nova.network.neutron [req-3ddf28eb-19d1-40df-b054-ba0f21c663e2 req-0b581e93-973b-416e-a18b-27c9d056725f service nova] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Refreshing network info cache for port 2333df80-c18a-4373-968f-007363bc1d2d {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1440.254456] env[68617]: DEBUG nova.network.neutron [req-3ddf28eb-19d1-40df-b054-ba0f21c663e2 req-0b581e93-973b-416e-a18b-27c9d056725f service nova] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Updated VIF entry in instance network info cache for port 2333df80-c18a-4373-968f-007363bc1d2d. {{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1440.254797] env[68617]: DEBUG nova.network.neutron [req-3ddf28eb-19d1-40df-b054-ba0f21c663e2 req-0b581e93-973b-416e-a18b-27c9d056725f service nova] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Updating instance_info_cache with network_info: [{"id": "2333df80-c18a-4373-968f-007363bc1d2d", "address": "fa:16:3e:d2:61:f6", "network": {"id": "1d9c32bb-1c81-4af6-8d3f-365a52df11cd", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-313904480-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1e11c4e5c25a42119594647403c0199b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d62c1cf-f39a-4626-9552-f1e13c692636", "external-id": "nsx-vlan-transportzone-748", "segmentation_id": 748, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2333df80-c1", "ovs_interfaceid": "2333df80-c18a-4373-968f-007363bc1d2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1440.263762] env[68617]: DEBUG oslo_concurrency.lockutils [req-3ddf28eb-19d1-40df-b054-ba0f21c663e2 req-0b581e93-973b-416e-a18b-27c9d056725f service nova] Releasing lock "refresh_cache-1605028f-5d6d-4ac4-8416-c0465982c53a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1482.883845] env[68617]: WARNING oslo_vmware.rw_handles [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Error
occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1482.883845] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1482.883845] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1482.883845] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1482.883845] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1482.883845] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 1482.883845] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1482.883845] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1482.883845] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1482.883845] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1482.883845] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1482.883845] env[68617]: ERROR oslo_vmware.rw_handles [ 1482.885825] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/48ace648-4811-4e5e-8fc1-ab74e11dd2c7/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1482.886391] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1482.886643] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Copying Virtual Disk [datastore2] vmware_temp/48ace648-4811-4e5e-8fc1-ab74e11dd2c7/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/48ace648-4811-4e5e-8fc1-ab74e11dd2c7/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1482.886938] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-7a140f02-eec5-44aa-be6f-56036493bd3b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1482.895408] env[68617]: DEBUG oslo_vmware.api [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Waiting for the task: (returnval){ [ 1482.895408] env[68617]: value = "task-3470826" [ 1482.895408] env[68617]: _type = "Task" [ 1482.895408] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1482.903193] env[68617]: DEBUG oslo_vmware.api [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Task: {'id': task-3470826, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1483.406539] env[68617]: DEBUG oslo_vmware.exceptions [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Fault InvalidArgument not matched. {{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1483.407178] env[68617]: DEBUG oslo_concurrency.lockutils [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1483.407445] env[68617]: ERROR nova.compute.manager [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1483.407445] env[68617]: Faults: ['InvalidArgument'] [ 1483.407445] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Traceback (most recent call last): [ 1483.407445] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1483.407445] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] yield resources [ 1483.407445] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1483.407445] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] self.driver.spawn(context, instance, image_meta, [ 1483.407445] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1483.407445] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1483.407445] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1483.407445] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] self._fetch_image_if_missing(context, vi) [ 1483.407445] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1483.407788] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] image_cache(vi, tmp_image_ds_loc) [ 1483.407788] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] 
File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1483.407788] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] vm_util.copy_virtual_disk( [ 1483.407788] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1483.407788] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] session._wait_for_task(vmdk_copy_task) [ 1483.407788] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1483.407788] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] return self.wait_for_task(task_ref) [ 1483.407788] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1483.407788] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] return evt.wait() [ 1483.407788] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1483.407788] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] result = hub.switch() [ 1483.407788] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1483.407788] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] return self.greenlet.switch() [ 1483.408385] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1483.408385] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] self.f(*self.args, **self.kw) [ 1483.408385] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1483.408385] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] raise exceptions.translate_fault(task_info.error) [ 1483.408385] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1483.408385] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Faults: ['InvalidArgument'] [ 1483.408385] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] [ 1483.408385] env[68617]: INFO nova.compute.manager [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Terminating instance [ 1483.409436] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Acquired lock "[datastore2] 
devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1483.409611] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1483.409872] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b2d12db2-a41d-4f32-868f-dcf331a9e421 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1483.412115] env[68617]: DEBUG nova.compute.manager [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1483.412319] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1483.413040] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce607deb-97cf-421f-a68a-cd1d2dbcf6f0 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1483.420056] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1483.420301] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-709421d9-a628-400e-8ff7-f0751ed2c86d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1483.422530] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1483.422703] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1483.423639] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-64fc7155-bc05-4f8d-bf25-101f016a3486 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1483.428764] env[68617]: DEBUG oslo_vmware.api [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Waiting for the task: (returnval){ [ 1483.428764] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]523b2abb-0744-04b8-6a23-d756345a914d" [ 1483.428764] env[68617]: _type = "Task" [ 1483.428764] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1483.436346] env[68617]: DEBUG oslo_vmware.api [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]523b2abb-0744-04b8-6a23-d756345a914d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1483.495435] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1483.495651] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1483.495818] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Deleting the datastore file [datastore2] dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908 {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1483.496092] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-70e4a8f5-5a3a-46fa-83f8-dc23ef89c811 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1483.502208] env[68617]: DEBUG oslo_vmware.api [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Waiting for the task: (returnval){ [ 1483.502208] env[68617]: value = "task-3470828" [ 1483.502208] env[68617]: _type = "Task" [ 1483.502208] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1483.509448] env[68617]: DEBUG oslo_vmware.api [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Task: {'id': task-3470828, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1483.938440] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1483.938722] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Creating directory with path [datastore2] vmware_temp/119ad5eb-c189-414b-88a7-8183a891e48e/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1483.938947] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9306b1df-7035-4e37-95c1-c5d39a0177bd {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1483.954777] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Created directory with path [datastore2] vmware_temp/119ad5eb-c189-414b-88a7-8183a891e48e/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1483.955039] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Fetch image to [datastore2] vmware_temp/119ad5eb-c189-414b-88a7-8183a891e48e/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1483.955233] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/119ad5eb-c189-414b-88a7-8183a891e48e/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1483.955946] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a76fbd5-d911-43c9-bcf1-851be9cc8486 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1483.962413] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d08521b2-9b13-442e-bf25-9ae7fda572a8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1483.971225] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55bb946f-8705-42be-b557-b6ca71f6dba9 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.002634] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-78f94586-54e9-4a5b-a90e-7385305167d4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.013196] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-cd078a6f-7a37-4c2e-ba59-188f479f0e1d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.014821] env[68617]: DEBUG oslo_vmware.api [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Task: {'id': task-3470828, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069173} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1484.015096] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1484.015282] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1484.015448] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1484.015619] env[68617]: INFO nova.compute.manager [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Took 0.60 seconds to destroy the instance on the hypervisor. 
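Each 'Waiting for the task: (returnval){ value = ..., _type = "Task" }' / 'progress is N%' pair in this run is oslo.vmware's task poller at work: every vCenter call that returns a Task (CreateVM_Task, SearchDatastore_Task, CopyVirtualDisk_Task, DeleteDatastoreFile_Task above) is polled until it reaches a terminal state. A minimal sketch of that poll-until-terminal loop, assuming a generic TaskInfo shape rather than oslo.vmware's real interface:

    import time
    from dataclasses import dataclass

    @dataclass
    class TaskInfo:
        state: str              # 'queued' | 'running' | 'success' | 'error'
        progress: int = 0
        error: str | None = None

    def wait_for_task(poll, interval=0.5, timeout=300.0):
        # Poll a vCenter-style task until it succeeds or fails; a failure
        # surfaces as an exception, much as the CopyVirtualDisk_Task fault
        # above is re-raised as a VimFaultException.
        deadline = time.monotonic() + timeout
        while True:
            info = poll()        # one RetrievePropertiesEx-style round trip
            if info.state == 'success':
                return info
            if info.state == 'error':
                raise RuntimeError(info.error)
            if time.monotonic() > deadline:
                raise TimeoutError(f'task still {info.state!r} after {timeout:.0f}s')
            time.sleep(interval)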
[ 1484.017658] env[68617]: DEBUG nova.compute.claims [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1484.017835] env[68617]: DEBUG oslo_concurrency.lockutils [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1484.018061] env[68617]: DEBUG oslo_concurrency.lockutils [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1484.037777] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1484.088222] env[68617]: DEBUG oslo_vmware.rw_handles [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/119ad5eb-c189-414b-88a7-8183a891e48e/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1484.148186] env[68617]: DEBUG oslo_vmware.rw_handles [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1484.148385] env[68617]: DEBUG oslo_vmware.rw_handles [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/119ad5eb-c189-414b-88a7-8183a891e48e/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1484.324846] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23adc012-0eb2-4f3f-b938-399d22bb2912 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.332754] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28611bc3-fb07-48ac-a1e6-bc0e39cf18c7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.361499] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8545733-44e5-44a2-bbdb-b211126136cf {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.368303] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e15dbc5-6f09-4645-a640-fad5bd8f9971 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.381913] env[68617]: DEBUG nova.compute.provider_tree [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1484.390620] env[68617]: DEBUG nova.scheduler.client.report [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1484.404369] env[68617]: DEBUG oslo_concurrency.lockutils [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.386s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1484.404874] env[68617]: ERROR nova.compute.manager [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1484.404874] env[68617]: Faults: ['InvalidArgument'] [ 1484.404874] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Traceback (most recent call last): [ 1484.404874] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1484.404874] env[68617]: ERROR 
nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] self.driver.spawn(context, instance, image_meta, [ 1484.404874] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1484.404874] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1484.404874] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1484.404874] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] self._fetch_image_if_missing(context, vi) [ 1484.404874] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1484.404874] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] image_cache(vi, tmp_image_ds_loc) [ 1484.404874] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1484.405245] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] vm_util.copy_virtual_disk( [ 1484.405245] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1484.405245] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] session._wait_for_task(vmdk_copy_task) [ 1484.405245] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1484.405245] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] return self.wait_for_task(task_ref) [ 1484.405245] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1484.405245] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] return evt.wait() [ 1484.405245] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1484.405245] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] result = hub.switch() [ 1484.405245] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1484.405245] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] return self.greenlet.switch() [ 1484.405245] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1484.405245] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] self.f(*self.args, **self.kw) [ 1484.405525] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1484.405525] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] raise exceptions.translate_fault(task_info.error) [ 1484.405525] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1484.405525] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Faults: ['InvalidArgument'] [ 1484.405525] env[68617]: ERROR nova.compute.manager [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] [ 1484.405639] env[68617]: DEBUG nova.compute.utils [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1484.407372] env[68617]: DEBUG nova.compute.manager [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Build of instance dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908 was re-scheduled: A specified parameter was not correct: fileType [ 1484.407372] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1484.407769] env[68617]: DEBUG nova.compute.manager [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1484.407939] env[68617]: DEBUG nova.compute.manager [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1484.408123] env[68617]: DEBUG nova.compute.manager [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1484.408304] env[68617]: DEBUG nova.network.neutron [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1484.791544] env[68617]: DEBUG nova.network.neutron [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1484.806054] env[68617]: INFO nova.compute.manager [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Took 0.40 seconds to deallocate network for instance. [ 1484.909098] env[68617]: INFO nova.scheduler.client.report [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Deleted allocations for instance dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908 [ 1484.936721] env[68617]: DEBUG oslo_concurrency.lockutils [None req-56b8f249-9ce2-4c0b-8b94-b265dc2c9b19 tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Lock "dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 636.247s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1484.938112] env[68617]: DEBUG oslo_concurrency.lockutils [None req-de0db407-7a80-4611-b802-00eb6401471a tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Lock "dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 439.725s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1484.938370] env[68617]: DEBUG oslo_concurrency.lockutils [None req-de0db407-7a80-4611-b802-00eb6401471a tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Acquiring lock "dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1484.938586] env[68617]: DEBUG oslo_concurrency.lockutils [None req-de0db407-7a80-4611-b802-00eb6401471a tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Lock "dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
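The lockutils lines above carry waited/held timings: the per-instance lock was held 636.247s by the build path, so the terminate request queued behind it for 439.725s. A rough sketch of that instrumentation pattern, assuming a simple in-process lock registry rather than oslo.concurrency's actual decorator:

    import threading
    import time
    from contextlib import contextmanager

    _locks: dict[str, threading.Lock] = {}

    @contextmanager
    def timed_lock(name: str, owner: str, log):
        # Report how long we waited to acquire and how long we held the
        # lock, in the style of the ':: waited N.NNNs' / ':: held N.NNNs'
        # lines above.
        lock = _locks.setdefault(name, threading.Lock())
        t0 = time.monotonic()
        lock.acquire()
        log.debug('Lock "%s" acquired by "%s" :: waited %.3fs',
                  name, owner, time.monotonic() - t0)
        t1 = time.monotonic()
        try:
            yield
        finally:
            lock.release()
            log.debug('Lock "%s" "released" by "%s" :: held %.3fs',
                      name, owner, time.monotonic() - t1)

Held times in the hundreds of seconds, as here, typically mean the lock spans an entire build-and-run attempt, which is why the delete in the next entries only proceeds once the rescheduled build gives the lock up.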
[ 1484.938827] env[68617]: DEBUG oslo_concurrency.lockutils [None req-de0db407-7a80-4611-b802-00eb6401471a tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Lock "dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1484.941561] env[68617]: INFO nova.compute.manager [None req-de0db407-7a80-4611-b802-00eb6401471a tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Terminating instance [ 1484.943609] env[68617]: DEBUG nova.compute.manager [None req-de0db407-7a80-4611-b802-00eb6401471a tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1484.943979] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-de0db407-7a80-4611-b802-00eb6401471a tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1484.944379] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-fb84801c-5afb-4604-9d4f-c66973edcf9c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.953978] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a2b1b14-584b-4d7d-9f37-94787476b204 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.965617] env[68617]: DEBUG nova.compute.manager [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1484.986479] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-de0db407-7a80-4611-b802-00eb6401471a tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908 could not be found. [ 1484.986556] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-de0db407-7a80-4611-b802-00eb6401471a tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1484.986726] env[68617]: INFO nova.compute.manager [None req-de0db407-7a80-4611-b802-00eb6401471a tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Took 0.04 seconds to destroy the instance on the hypervisor.
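Note how the destroy path above stays idempotent: FindAllByUuid turns up no VM for dbbe6fb0 (the earlier failed spawn already unregistered it), the resulting InstanceNotFound is logged as a WARNING rather than raised, and teardown proceeds so network and allocations are still cleaned up. A condensed sketch of that pattern; destroy_vm, InstanceNotFound and the call shape here are illustrative stand-ins:

    import logging

    LOG = logging.getLogger(__name__)

    class InstanceNotFound(Exception):
        """Stand-in for nova.exception.InstanceNotFound."""

    def destroy(instance_uuid: str, destroy_vm) -> None:
        # A second delete (or a delete after a failed spawn) must still
        # succeed: if the hypervisor has no VM for this UUID, warn and
        # carry on so the rest of the teardown runs.
        try:
            destroy_vm(instance_uuid)
        except InstanceNotFound:
            LOG.warning("Instance does not exist on backend: %s", instance_uuid)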
[ 1484.987397] env[68617]: DEBUG oslo.service.loopingcall [None req-de0db407-7a80-4611-b802-00eb6401471a tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1484.987397] env[68617]: DEBUG nova.compute.manager [-] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1484.987397] env[68617]: DEBUG nova.network.neutron [-] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1485.013337] env[68617]: DEBUG nova.network.neutron [-] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1485.020615] env[68617]: INFO nova.compute.manager [-] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] Took 0.03 seconds to deallocate network for instance. [ 1485.040788] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1485.041034] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1485.042544] env[68617]: INFO nova.compute.claims [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1485.117288] env[68617]: DEBUG oslo_concurrency.lockutils [None req-de0db407-7a80-4611-b802-00eb6401471a tempest-ServersAdminTestJSON-2141123961 tempest-ServersAdminTestJSON-2141123961-project-member] Lock "dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.179s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1485.118295] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 195.577s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1485.118396] env[68617]: INFO nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908] During sync_power_state the instance has a pending task (deleting). Skip.
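The 'Inventory has not changed ... based on inventory data' report at 1484.390620 above (repeated at 1485.352185 below) shows the numbers behind the 'Claim successful' line: in effect, Placement turns each resource class into usable capacity as (total - reserved) * allocation_ratio. Worked through with the values from this log, keeping only the fields the formula uses:

    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    for rc, inv in inventory.items():
        usable = int((inv['total'] - inv['reserved']) * inv['allocation_ratio'])
        print(rc, usable)
    # VCPU 192, MEMORY_MB 196078, DISK_GB 400

So the 4.0 VCPU allocation_ratio lets 48 physical cores back 192 vCPUs of claims, while memory is not oversubscribed at all.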
[ 1485.118552] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "dbbe6fb0-b9f2-4d42-a0b1-a733f1b5f908" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1485.288900] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66047881-ed50-4808-86fe-1a2b44380e3f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1485.296491] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-040fce27-3518-426d-8a9c-9a502e000eb4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1485.325048] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-341afe38-3a29-4135-87c0-b3105a75c3a6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1485.331887] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8b264fd-26fb-4623-bb72-bd43f0db5931 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1485.344536] env[68617]: DEBUG nova.compute.provider_tree [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1485.352185] env[68617]: DEBUG nova.scheduler.client.report [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1485.366258] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.325s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1485.366690] env[68617]: DEBUG nova.compute.manager [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Start building networks asynchronously for instance.
{{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1485.399564] env[68617]: DEBUG nova.compute.utils [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1485.401019] env[68617]: DEBUG nova.compute.manager [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1485.401107] env[68617]: DEBUG nova.network.neutron [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1485.409298] env[68617]: DEBUG nova.compute.manager [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1485.463992] env[68617]: DEBUG nova.policy [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '224a4101e01748579f093e7116ca2a1a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b191b6855afd48fb9335661e492e3d39', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 1485.471398] env[68617]: DEBUG nova.compute.manager [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1485.497763] env[68617]: DEBUG nova.virt.hardware [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1485.497999] env[68617]: DEBUG nova.virt.hardware [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1485.498177] env[68617]: DEBUG nova.virt.hardware [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1485.498354] env[68617]: DEBUG nova.virt.hardware [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1485.498498] env[68617]: DEBUG nova.virt.hardware [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1485.498642] env[68617]: DEBUG nova.virt.hardware [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1485.498972] env[68617]: DEBUG nova.virt.hardware [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1485.499166] env[68617]: DEBUG nova.virt.hardware [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1485.499536] env[68617]: DEBUG 
nova.virt.hardware [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1485.499705] env[68617]: DEBUG nova.virt.hardware [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1485.499882] env[68617]: DEBUG nova.virt.hardware [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1485.500708] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51cf9b8e-5692-439a-bafe-f48cba55a1ae {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1485.510291] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7510f169-e252-43ae-82cc-639c62704b0b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1485.775244] env[68617]: DEBUG nova.network.neutron [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Successfully created port: e42a452b-3ca1-4ce6-9f19-4e35c229ed17 {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1486.555899] env[68617]: DEBUG nova.network.neutron [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Successfully updated port: e42a452b-3ca1-4ce6-9f19-4e35c229ed17 {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1486.568430] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Acquiring lock "refresh_cache-fc1043b8-535d-4af0-b92b-1f43580cdc9a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1486.568570] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Acquired lock "refresh_cache-fc1043b8-535d-4af0-b92b-1f43580cdc9a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1486.568717] env[68617]: DEBUG nova.network.neutron [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1486.607962] env[68617]: DEBUG nova.network.neutron [None 
req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1486.787497] env[68617]: DEBUG nova.network.neutron [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Updating instance_info_cache with network_info: [{"id": "e42a452b-3ca1-4ce6-9f19-4e35c229ed17", "address": "fa:16:3e:e5:3b:89", "network": {"id": "6d8ddf36-28a8-4ec5-8fb8-d3577062a14c", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-512079755-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b191b6855afd48fb9335661e492e3d39", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ce17e10e-2fb0-4191-afee-e2b89fa15074", "external-id": "nsx-vlan-transportzone-352", "segmentation_id": 352, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape42a452b-3c", "ovs_interfaceid": "e42a452b-3ca1-4ce6-9f19-4e35c229ed17", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1486.799864] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Releasing lock "refresh_cache-fc1043b8-535d-4af0-b92b-1f43580cdc9a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1486.800161] env[68617]: DEBUG nova.compute.manager [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Instance network_info: |[{"id": "e42a452b-3ca1-4ce6-9f19-4e35c229ed17", "address": "fa:16:3e:e5:3b:89", "network": {"id": "6d8ddf36-28a8-4ec5-8fb8-d3577062a14c", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-512079755-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b191b6855afd48fb9335661e492e3d39", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ce17e10e-2fb0-4191-afee-e2b89fa15074", "external-id": "nsx-vlan-transportzone-352", "segmentation_id": 352, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape42a452b-3c", "ovs_interfaceid": 
"e42a452b-3ca1-4ce6-9f19-4e35c229ed17", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1486.800539] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e5:3b:89', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ce17e10e-2fb0-4191-afee-e2b89fa15074', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e42a452b-3ca1-4ce6-9f19-4e35c229ed17', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1486.807970] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Creating folder: Project (b191b6855afd48fb9335661e492e3d39). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1486.808476] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-49426d5d-f41c-4fc8-a38c-3f3d4ad0aef6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1486.820892] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Created folder: Project (b191b6855afd48fb9335661e492e3d39) in parent group-v693691. [ 1486.821070] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Creating folder: Instances. Parent ref: group-v693771. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1486.821285] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2fcf3389-a02b-4999-95ad-2e7ba84a5d2f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1486.830998] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Created folder: Instances in parent group-v693771. [ 1486.831238] env[68617]: DEBUG oslo.service.loopingcall [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1486.831412] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1486.831595] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9623f69a-3767-4f2b-b4dc-6f58375c4c0d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1486.850755] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1486.850755] env[68617]: value = "task-3470831" [ 1486.850755] env[68617]: _type = "Task" [ 1486.850755] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1486.859583] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470831, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1486.880300] env[68617]: DEBUG nova.compute.manager [req-beb825e2-20d9-48a2-b665-65d0d52c7883 req-0ae11504-b449-4c81-818c-3403b2aacc75 service nova] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Received event network-vif-plugged-e42a452b-3ca1-4ce6-9f19-4e35c229ed17 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1486.880417] env[68617]: DEBUG oslo_concurrency.lockutils [req-beb825e2-20d9-48a2-b665-65d0d52c7883 req-0ae11504-b449-4c81-818c-3403b2aacc75 service nova] Acquiring lock "fc1043b8-535d-4af0-b92b-1f43580cdc9a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1486.880702] env[68617]: DEBUG oslo_concurrency.lockutils [req-beb825e2-20d9-48a2-b665-65d0d52c7883 req-0ae11504-b449-4c81-818c-3403b2aacc75 service nova] Lock "fc1043b8-535d-4af0-b92b-1f43580cdc9a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1486.880932] env[68617]: DEBUG oslo_concurrency.lockutils [req-beb825e2-20d9-48a2-b665-65d0d52c7883 req-0ae11504-b449-4c81-818c-3403b2aacc75 service nova] Lock "fc1043b8-535d-4af0-b92b-1f43580cdc9a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1486.881174] env[68617]: DEBUG nova.compute.manager [req-beb825e2-20d9-48a2-b665-65d0d52c7883 req-0ae11504-b449-4c81-818c-3403b2aacc75 service nova] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] No waiting events found dispatching network-vif-plugged-e42a452b-3ca1-4ce6-9f19-4e35c229ed17 {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1486.881478] env[68617]: WARNING nova.compute.manager [req-beb825e2-20d9-48a2-b665-65d0d52c7883 req-0ae11504-b449-4c81-818c-3403b2aacc75 service nova] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Received unexpected event network-vif-plugged-e42a452b-3ca1-4ce6-9f19-4e35c229ed17 for instance with vm_state building and task_state spawning.
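
The "Waiting for the task: (returnval){ value = "task-3470831" ... }" block and the "_poll_task ... progress is 0%." line above come from oslo.vmware polling CreateVM_Task on a fixed interval until it reaches a terminal state. A stdlib-only sketch of such a polling loop; poll_task and the dict shape are assumptions for illustration, not the oslo.vmware API:

    import time

    def wait_for_task(poll_task, interval=0.5, timeout=300.0):
        # poll_task() is assumed to return a dict shaped like the Task lines
        # in this log, e.g. {'id': 'task-3470831', 'name': 'CreateVM_Task',
        # 'state': 'running', 'progress': 0}.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            task = poll_task()
            print("Task: {'id': %s, 'name': %s} progress is %s%%."
                  % (task['id'], task['name'], task.get('progress', 0)))
            if task['state'] == 'success':
                return task
            if task['state'] == 'error':
                raise RuntimeError("Task %s failed" % task['id'])
            time.sleep(interval)  # fixed-interval re-poll, matching the log cadence
        raise TimeoutError("task did not complete within %.0fs" % timeout)
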
[ 1486.881612] env[68617]: DEBUG nova.compute.manager [req-beb825e2-20d9-48a2-b665-65d0d52c7883 req-0ae11504-b449-4c81-818c-3403b2aacc75 service nova] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Received event network-changed-e42a452b-3ca1-4ce6-9f19-4e35c229ed17 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1486.881827] env[68617]: DEBUG nova.compute.manager [req-beb825e2-20d9-48a2-b665-65d0d52c7883 req-0ae11504-b449-4c81-818c-3403b2aacc75 service nova] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Refreshing instance network info cache due to event network-changed-e42a452b-3ca1-4ce6-9f19-4e35c229ed17. {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1486.882102] env[68617]: DEBUG oslo_concurrency.lockutils [req-beb825e2-20d9-48a2-b665-65d0d52c7883 req-0ae11504-b449-4c81-818c-3403b2aacc75 service nova] Acquiring lock "refresh_cache-fc1043b8-535d-4af0-b92b-1f43580cdc9a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1486.882302] env[68617]: DEBUG oslo_concurrency.lockutils [req-beb825e2-20d9-48a2-b665-65d0d52c7883 req-0ae11504-b449-4c81-818c-3403b2aacc75 service nova] Acquired lock "refresh_cache-fc1043b8-535d-4af0-b92b-1f43580cdc9a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1486.882525] env[68617]: DEBUG nova.network.neutron [req-beb825e2-20d9-48a2-b665-65d0d52c7883 req-0ae11504-b449-4c81-818c-3403b2aacc75 service nova] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Refreshing network info cache for port e42a452b-3ca1-4ce6-9f19-4e35c229ed17 {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1487.138221] env[68617]: DEBUG nova.network.neutron [req-beb825e2-20d9-48a2-b665-65d0d52c7883 req-0ae11504-b449-4c81-818c-3403b2aacc75 service nova] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Updated VIF entry in instance network info cache for port e42a452b-3ca1-4ce6-9f19-4e35c229ed17. 
{{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1487.138600] env[68617]: DEBUG nova.network.neutron [req-beb825e2-20d9-48a2-b665-65d0d52c7883 req-0ae11504-b449-4c81-818c-3403b2aacc75 service nova] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Updating instance_info_cache with network_info: [{"id": "e42a452b-3ca1-4ce6-9f19-4e35c229ed17", "address": "fa:16:3e:e5:3b:89", "network": {"id": "6d8ddf36-28a8-4ec5-8fb8-d3577062a14c", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-512079755-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b191b6855afd48fb9335661e492e3d39", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ce17e10e-2fb0-4191-afee-e2b89fa15074", "external-id": "nsx-vlan-transportzone-352", "segmentation_id": 352, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape42a452b-3c", "ovs_interfaceid": "e42a452b-3ca1-4ce6-9f19-4e35c229ed17", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1487.148401] env[68617]: DEBUG oslo_concurrency.lockutils [req-beb825e2-20d9-48a2-b665-65d0d52c7883 req-0ae11504-b449-4c81-818c-3403b2aacc75 service nova] Releasing lock "refresh_cache-fc1043b8-535d-4af0-b92b-1f43580cdc9a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1487.360532] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470831, 'name': CreateVM_Task, 'duration_secs': 0.293518} completed successfully. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1487.360782] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1487.361407] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1487.361572] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1487.361887] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1487.362704] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3af02751-b9a0-43a6-850c-dc58994a0494 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1487.366930] env[68617]: DEBUG oslo_vmware.api [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Waiting for the task: (returnval){ [ 1487.366930] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]523ffc36-437f-2e73-6129-41a484b8d08d" [ 1487.366930] env[68617]: _type = "Task" [ 1487.366930] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1487.375199] env[68617]: DEBUG oslo_vmware.api [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]523ffc36-437f-2e73-6129-41a484b8d08d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1487.699065] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1487.699065] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1487.699065] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1487.720486] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1487.720629] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1487.720767] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1487.720911] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1487.721045] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1487.721171] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1487.721291] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1487.721411] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Skipping network cache update for instance because it is Building. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1487.721530] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1487.721648] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1487.721779] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1487.877337] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1487.877574] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1487.877784] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1490.699203] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1491.699059] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1491.699329] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1492.699582] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1494.694936] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic 
task ComputeManager._sync_scheduler_instance_info {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1494.724892] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1494.725224] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1495.699509] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1495.732880] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e6e8305f-8d9a-4bf2-a290-8f8b5922463c tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Acquiring lock "ee6efd93-25be-4268-afe9-ba39e543a4fb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1497.694892] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1497.698539] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1497.710159] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1497.710240] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1497.710394] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1497.710551] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1497.711676] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with
opID=oslo.vmware-0929f44e-ad01-46c3-a0f3-42e9b23165f7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1497.721042] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2af8e9e-ecc5-46e5-8ed9-08d424635f69 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1497.734904] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd20e50a-ea6d-47aa-80c9-8f3649d9bded {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1497.741058] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4922d741-65eb-4636-8d25-6b8145fe2794 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1497.771039] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180884MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1497.771218] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1497.771387] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1497.847919] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 79c92a1b-20ef-4360-93b4-913cbfcf92fe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1497.847919] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 1cc42c7f-8781-40b0-9f75-edfef3bc90e7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1497.847919] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance d46ca6f3-0ee9-412c-98b4-f639ce4f9228 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1497.848088] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a8ff6232-530c-453a-96e4-f8ce00f976e3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1497.848126] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1497.848224] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e90877a8-47d3-47d7-8362-5bcfe3a98c36 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1497.848343] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f03b9bc5-9438-4c0c-b595-72c631bece08 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1497.848458] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance ee6efd93-25be-4268-afe9-ba39e543a4fb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1497.848570] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 1605028f-5d6d-4ac4-8416-c0465982c53a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1497.848680] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance fc1043b8-535d-4af0-b92b-1f43580cdc9a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1497.860373] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1497.870486] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a019d654-82ed-4ef2-850f-39a1f324566a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1497.880198] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 43495abf-8f99-4f51-81ca-80a43c266695 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1497.890957] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b9d0b85a-f0ac-4f9e-bec4-a82db0eb96c3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1497.900365] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 5d294d66-266f-4a0b-be49-5061fb65b226 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1497.910202] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f8c0a514-7e7f-455a-b84d-9afc2957945c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1497.920423] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 9ca297f6-3239-48d3-9b67-dd1637a3bc25 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1497.929997] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 57cdcf44-576a-4343-9277-4b9ebb2b194a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1497.930249] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1497.930475] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1497.943821] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1dd8aa56-44f4-4ed5-b4c1-0de8a55f937b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "1605028f-5d6d-4ac4-8416-c0465982c53a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1498.136788] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-358782b1-40a2-452d-b6d9-ad93ed39cc05 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1498.144461] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45891277-e973-4ac7-a0aa-592da3fc838c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1498.173966] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b72820d8-c435-47e1-a6cf-3f31e2fb24ed {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1498.180767] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a48811a-150f-4158-8d0a-109d04ab9ad8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1498.193339] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1498.201400] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1498.214027] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource
/opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1498.214027] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.443s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1522.790770] env[68617]: DEBUG oslo_concurrency.lockutils [None req-40dee1cc-64c4-43cd-b779-cde0e29e04da tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Acquiring lock "fc1043b8-535d-4af0-b92b-1f43580cdc9a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1532.407827] env[68617]: WARNING oslo_vmware.rw_handles [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1532.407827] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1532.407827] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1532.407827] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1532.407827] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1532.407827] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 1532.407827] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1532.407827] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1532.407827] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1532.407827] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1532.407827] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1532.407827] env[68617]: ERROR oslo_vmware.rw_handles [ 1532.408442] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/119ad5eb-c189-414b-88a7-8183a891e48e/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1532.410142] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1532.410388] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Copying Virtual Disk [datastore2]
vmware_temp/119ad5eb-c189-414b-88a7-8183a891e48e/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/119ad5eb-c189-414b-88a7-8183a891e48e/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1532.410661] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9aa2956f-3a37-436c-83ed-17706eb2dc0e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1532.418958] env[68617]: DEBUG oslo_vmware.api [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Waiting for the task: (returnval){ [ 1532.418958] env[68617]: value = "task-3470832" [ 1532.418958] env[68617]: _type = "Task" [ 1532.418958] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1532.426476] env[68617]: DEBUG oslo_vmware.api [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Task: {'id': task-3470832, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1532.930193] env[68617]: DEBUG oslo_vmware.exceptions [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Fault InvalidArgument not matched. {{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1532.930526] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1532.931156] env[68617]: ERROR nova.compute.manager [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1532.931156] env[68617]: Faults: ['InvalidArgument'] [ 1532.931156] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Traceback (most recent call last): [ 1532.931156] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1532.931156] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] yield resources [ 1532.931156] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1532.931156] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] self.driver.spawn(context, instance, image_meta, [ 1532.931156] env[68617]: ERROR nova.compute.manager [instance: 
79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1532.931156] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1532.931156] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1532.931156] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] self._fetch_image_if_missing(context, vi) [ 1532.931156] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1532.931519] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] image_cache(vi, tmp_image_ds_loc) [ 1532.931519] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1532.931519] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] vm_util.copy_virtual_disk( [ 1532.931519] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1532.931519] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] session._wait_for_task(vmdk_copy_task) [ 1532.931519] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1532.931519] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] return self.wait_for_task(task_ref) [ 1532.931519] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1532.931519] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] return evt.wait() [ 1532.931519] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1532.931519] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] result = hub.switch() [ 1532.931519] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1532.931519] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] return self.greenlet.switch() [ 1532.931873] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1532.931873] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] self.f(*self.args, **self.kw) [ 1532.931873] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1532.931873] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] raise 
exceptions.translate_fault(task_info.error) [ 1532.931873] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1532.931873] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Faults: ['InvalidArgument'] [ 1532.931873] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] [ 1532.931873] env[68617]: INFO nova.compute.manager [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Terminating instance [ 1532.933216] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1532.933455] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1532.933725] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-80a36224-686b-4b9d-80ec-9e2377fe5af8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1532.936868] env[68617]: DEBUG nova.compute.manager [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Start destroying the instance on the hypervisor. 
{{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1532.937111] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1532.937905] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38f48ec9-0746-4f8b-88c1-686b610076bd {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1532.945902] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1532.947029] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f8f22f39-8270-4ad2-b862-d3b0a49163d4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1532.948495] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1532.948666] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1532.949317] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-63be1c01-8472-4a83-b5cc-4ad52c07b16b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1532.954656] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Waiting for the task: (returnval){ [ 1532.954656] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]526939f3-4b02-0585-3f1a-c1b7c090c4b6" [ 1532.954656] env[68617]: _type = "Task" [ 1532.954656] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1532.962298] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]526939f3-4b02-0585-3f1a-c1b7c090c4b6, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1533.029616] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1533.029867] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1533.030056] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Deleting the datastore file [datastore2] 79c92a1b-20ef-4360-93b4-913cbfcf92fe {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1533.030315] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5c82cdc2-6b78-48da-a78c-0c74e4097268 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1533.036317] env[68617]: DEBUG oslo_vmware.api [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Waiting for the task: (returnval){ [ 1533.036317] env[68617]: value = "task-3470834" [ 1533.036317] env[68617]: _type = "Task" [ 1533.036317] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1533.045933] env[68617]: DEBUG oslo_vmware.api [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Task: {'id': task-3470834, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1533.464711] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1533.465121] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Creating directory with path [datastore2] vmware_temp/215b96e7-b230-446f-88de-4ce754d9b571/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1533.465199] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-aed98f2b-33af-423c-b3ce-609e800fc9ef {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1533.475957] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Created directory with path [datastore2] vmware_temp/215b96e7-b230-446f-88de-4ce754d9b571/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1533.476152] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Fetch image to [datastore2] vmware_temp/215b96e7-b230-446f-88de-4ce754d9b571/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1533.476322] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/215b96e7-b230-446f-88de-4ce754d9b571/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1533.477013] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47742710-7cfe-4875-8219-fbc9667bfcd7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1533.483131] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6686ea1-7503-40bc-92e0-36d103cf9253 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1533.491664] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61f2f2a1-dd46-4047-bc6f-538cbc030e0f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1533.521633] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-e3879fa9-cee0-4299-9828-e1abdba819d1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1533.527325] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3ffe4eb1-0bbd-4da7-837a-538da9647b75 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1533.544858] env[68617]: DEBUG oslo_vmware.api [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Task: {'id': task-3470834, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.060766} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1533.545124] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1533.545310] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1533.545495] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1533.545680] env[68617]: INFO nova.compute.manager [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Took 0.61 seconds to destroy the instance on the hypervisor. 
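
Note: the records above trace the VMware driver's image-cache path: the image is streamed to a temporary tmp-sparse.vmdk on the datastore, then copied into devstack-image-cache_base via CopyVirtualDisk_Task while oslo_vmware polls the task to completion (the DeleteDatastoreFile_Task just above completes the same way). A minimal sketch of that invoke-then-wait pattern follows; the host, credentials, and datastore paths are placeholders, not values from this log.

    # Sketch only: the CopyVirtualDisk_Task / wait_for_task flow seen above.
    # Host, credentials and paths below are hypothetical.
    from oslo_vmware import api as vmware_api
    from oslo_vmware import exceptions as vmware_exceptions

    session = vmware_api.VMwareAPISession(
        'vcenter.example.org', 'admin', 'secret',
        api_retry_count=10, task_poll_interval=0.5)

    vim = session.vim
    disk_mgr = vim.service_content.virtualDiskManager
    copy_task = session.invoke_api(
        vim, 'CopyVirtualDisk_Task', disk_mgr,
        sourceName='[datastore2] vmware_temp/example/tmp-sparse.vmdk',
        destName='[datastore2] devstack-image-cache_base/example.vmdk')
    try:
        # Blocks in a polling loop (the _poll_task records above) until the
        # task succeeds or its error is translated into an exception.
        session.wait_for_task(copy_task)
    except vmware_exceptions.VimFaultException as err:
        # An InvalidArgument fault ('fileType'), as in the tracebacks in this
        # log, surfaces here as a VimFaultException.
        print(err.fault_list, err.msg)
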
[ 1533.548701] env[68617]: DEBUG nova.compute.claims [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1533.548976] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1533.549328] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1533.555088] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1533.619517] env[68617]: DEBUG oslo_vmware.rw_handles [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/215b96e7-b230-446f-88de-4ce754d9b571/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1533.682684] env[68617]: DEBUG oslo_vmware.rw_handles [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1533.682880] env[68617]: DEBUG oslo_vmware.rw_handles [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/215b96e7-b230-446f-88de-4ce754d9b571/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1533.866782] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0442d2d-cb94-4dac-ab46-b442b206b579 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1533.874389] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc99ac8e-47f8-4f9a-80ff-ec275a25b941 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1533.906189] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3f8d44e-078c-4142-a70a-3e0448acf295 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1533.913165] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd4defca-1c85-4ca2-960f-ed330d4a88b6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1533.926026] env[68617]: DEBUG nova.compute.provider_tree [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1533.934336] env[68617]: DEBUG nova.scheduler.client.report [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1533.947996] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.399s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1533.948539] env[68617]: ERROR nova.compute.manager [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1533.948539] env[68617]: Faults: ['InvalidArgument'] [ 1533.948539] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Traceback (most recent call last): [ 1533.948539] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1533.948539] 
env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] self.driver.spawn(context, instance, image_meta, [ 1533.948539] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1533.948539] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1533.948539] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1533.948539] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] self._fetch_image_if_missing(context, vi) [ 1533.948539] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1533.948539] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] image_cache(vi, tmp_image_ds_loc) [ 1533.948539] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1533.948914] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] vm_util.copy_virtual_disk( [ 1533.948914] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1533.948914] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] session._wait_for_task(vmdk_copy_task) [ 1533.948914] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1533.948914] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] return self.wait_for_task(task_ref) [ 1533.948914] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1533.948914] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] return evt.wait() [ 1533.948914] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1533.948914] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] result = hub.switch() [ 1533.948914] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1533.948914] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] return self.greenlet.switch() [ 1533.948914] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1533.948914] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] self.f(*self.args, **self.kw) [ 1533.949262] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1533.949262] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] raise exceptions.translate_fault(task_info.error) [ 1533.949262] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1533.949262] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Faults: ['InvalidArgument'] [ 1533.949262] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] [ 1533.949262] env[68617]: DEBUG nova.compute.utils [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1533.950744] env[68617]: DEBUG nova.compute.manager [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Build of instance 79c92a1b-20ef-4360-93b4-913cbfcf92fe was re-scheduled: A specified parameter was not correct: fileType [ 1533.950744] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1533.951138] env[68617]: DEBUG nova.compute.manager [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1533.951312] env[68617]: DEBUG nova.compute.manager [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1533.951465] env[68617]: DEBUG nova.compute.manager [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1533.951627] env[68617]: DEBUG nova.network.neutron [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1534.203008] env[68617]: DEBUG neutronclient.v2_0.client [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68617) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1534.204095] env[68617]: ERROR nova.compute.manager [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1534.204095] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Traceback (most recent call last): [ 1534.204095] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1534.204095] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] self.driver.spawn(context, instance, image_meta, [ 1534.204095] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1534.204095] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1534.204095] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1534.204095] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] self._fetch_image_if_missing(context, vi) [ 1534.204095] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1534.204095] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] image_cache(vi, tmp_image_ds_loc) [ 1534.204095] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1534.204095] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] vm_util.copy_virtual_disk( [ 1534.204412] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1534.204412] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] session._wait_for_task(vmdk_copy_task) [ 1534.204412] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1534.204412] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] return self.wait_for_task(task_ref) [ 1534.204412] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1534.204412] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] return evt.wait() [ 1534.204412] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1534.204412] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] result = hub.switch() [ 1534.204412] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1534.204412] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] return self.greenlet.switch() [ 1534.204412] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1534.204412] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] self.f(*self.args, **self.kw) [ 1534.204412] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1534.204722] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] raise exceptions.translate_fault(task_info.error) [ 1534.204722] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1534.204722] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Faults: ['InvalidArgument'] [ 1534.204722] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] [ 1534.204722] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] During handling of the above exception, another exception occurred: [ 1534.204722] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] [ 1534.204722] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Traceback (most recent call last): [ 1534.204722] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/compute/manager.py", line 2430, in _do_build_and_run_instance [ 1534.204722] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] self._build_and_run_instance(context, instance, image, [ 1534.204722] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File 
"/opt/stack/nova/nova/compute/manager.py", line 2722, in _build_and_run_instance [ 1534.204722] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] raise exception.RescheduledException( [ 1534.204722] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] nova.exception.RescheduledException: Build of instance 79c92a1b-20ef-4360-93b4-913cbfcf92fe was re-scheduled: A specified parameter was not correct: fileType [ 1534.204722] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Faults: ['InvalidArgument'] [ 1534.204722] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] [ 1534.205069] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] During handling of the above exception, another exception occurred: [ 1534.205069] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] [ 1534.205069] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Traceback (most recent call last): [ 1534.205069] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1534.205069] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] ret = obj(*args, **kwargs) [ 1534.205069] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1534.205069] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] exception_handler_v20(status_code, error_body) [ 1534.205069] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1534.205069] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] raise client_exc(message=error_message, [ 1534.205069] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1534.205069] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Neutron server returns request_ids: ['req-3916d004-113d-422e-9424-6f40b1d333f7'] [ 1534.205069] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] [ 1534.205069] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] During handling of the above exception, another exception occurred: [ 1534.205372] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] [ 1534.205372] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Traceback (most recent call last): [ 1534.205372] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/compute/manager.py", line 3019, in _cleanup_allocated_networks [ 1534.205372] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] self._deallocate_network(context, instance, requested_networks) [ 1534.205372] env[68617]: ERROR nova.compute.manager 
[instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1534.205372] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] self.network_api.deallocate_for_instance( [ 1534.205372] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1534.205372] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] data = neutron.list_ports(**search_opts) [ 1534.205372] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1534.205372] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] ret = obj(*args, **kwargs) [ 1534.205372] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1534.205372] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] return self.list('ports', self.ports_path, retrieve_all, [ 1534.205372] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1534.205660] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] ret = obj(*args, **kwargs) [ 1534.205660] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1534.205660] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] for r in self._pagination(collection, path, **params): [ 1534.205660] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1534.205660] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] res = self.get(path, params=params) [ 1534.205660] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1534.205660] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] ret = obj(*args, **kwargs) [ 1534.205660] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1534.205660] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] return self.retry_request("GET", action, body=body, [ 1534.205660] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1534.205660] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] ret = obj(*args, **kwargs) [ 1534.205660] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1534.205660] env[68617]: ERROR nova.compute.manager [instance: 
79c92a1b-20ef-4360-93b4-913cbfcf92fe] return self.do_request(method, action, body=body, [ 1534.205951] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1534.205951] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] ret = obj(*args, **kwargs) [ 1534.205951] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1534.205951] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] self._handle_fault_response(status_code, replybody, resp) [ 1534.205951] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1534.205951] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] raise exception.Unauthorized() [ 1534.205951] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] nova.exception.Unauthorized: Not authorized. [ 1534.205951] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] [ 1534.260477] env[68617]: INFO nova.scheduler.client.report [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Deleted allocations for instance 79c92a1b-20ef-4360-93b4-913cbfcf92fe [ 1534.279978] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9f845ce4-85b2-41ff-bd28-c366672dccbb tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Lock "79c92a1b-20ef-4360-93b4-913cbfcf92fe" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 633.556s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1534.281024] env[68617]: DEBUG oslo_concurrency.lockutils [None req-442ecf20-674c-496c-a6f1-86f1a70c287c tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Lock "79c92a1b-20ef-4360-93b4-913cbfcf92fe" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 436.264s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1534.281259] env[68617]: DEBUG oslo_concurrency.lockutils [None req-442ecf20-674c-496c-a6f1-86f1a70c287c tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Acquiring lock "79c92a1b-20ef-4360-93b4-913cbfcf92fe-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1534.281434] env[68617]: DEBUG oslo_concurrency.lockutils [None req-442ecf20-674c-496c-a6f1-86f1a70c287c tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Lock "79c92a1b-20ef-4360-93b4-913cbfcf92fe-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1534.281627] env[68617]: DEBUG 
oslo_concurrency.lockutils [None req-442ecf20-674c-496c-a6f1-86f1a70c287c tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Lock "79c92a1b-20ef-4360-93b4-913cbfcf92fe-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1534.283561] env[68617]: INFO nova.compute.manager [None req-442ecf20-674c-496c-a6f1-86f1a70c287c tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Terminating instance [ 1534.285212] env[68617]: DEBUG nova.compute.manager [None req-442ecf20-674c-496c-a6f1-86f1a70c287c tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1534.285405] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-442ecf20-674c-496c-a6f1-86f1a70c287c tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1534.285871] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3887dae9-fd1d-46bb-b1b0-77139e2ffe6a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1534.294951] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca6b9dd5-53b7-4400-ac5a-39dc9553ff60 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1534.305426] env[68617]: DEBUG nova.compute.manager [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1534.325842] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-442ecf20-674c-496c-a6f1-86f1a70c287c tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 79c92a1b-20ef-4360-93b4-913cbfcf92fe could not be found. [ 1534.326070] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-442ecf20-674c-496c-a6f1-86f1a70c287c tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1534.326252] env[68617]: INFO nova.compute.manager [None req-442ecf20-674c-496c-a6f1-86f1a70c287c tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Took 0.04 seconds to destroy the instance on the hypervisor. 
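
Note: the Acquiring / acquired / "released" records above (the per-instance lock, the "-events" lock, and "compute_resources") are emitted by oslo.concurrency's synchronized wrapper, which logs how long each caller waited for and then held a named lock. A minimal sketch of that pattern, with an illustrative lock name:

    # Sketch only: the lockutils lifecycle behind the 'Acquiring lock ...',
    # 'acquired ... waited' and '"released" ... held' DEBUG records above.
    from oslo_concurrency import lockutils

    def terminate_instance(instance_uuid):
        @lockutils.synchronized(instance_uuid)
        def do_terminate_instance():
            # Work here runs with the named lock held; the wrapper logs the
            # waited/held durations seen in the records above.
            pass
        do_terminate_instance()

    terminate_instance('00000000-0000-0000-0000-000000000000')  # placeholder
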
[ 1534.326491] env[68617]: DEBUG oslo.service.loopingcall [None req-442ecf20-674c-496c-a6f1-86f1a70c287c tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1534.326710] env[68617]: DEBUG nova.compute.manager [-] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1534.326807] env[68617]: DEBUG nova.network.neutron [-] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1534.355317] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1534.355953] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1534.357087] env[68617]: INFO nova.compute.claims [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1534.453847] env[68617]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68617) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}}
[ 1534.454177] env[68617]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1534.454639] env[68617]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1534.454639] env[68617]: ERROR oslo.service.loopingcall Traceback (most recent call last):
[ 1534.454639] env[68617]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1534.454639] env[68617]: ERROR oslo.service.loopingcall     ret = obj(*args, **kwargs)
[ 1534.454639] env[68617]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1534.454639] env[68617]: ERROR oslo.service.loopingcall     exception_handler_v20(status_code, error_body)
[ 1534.454639] env[68617]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1534.454639] env[68617]: ERROR oslo.service.loopingcall     raise client_exc(message=error_message,
[ 1534.454639] env[68617]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1534.454639] env[68617]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-275aab86-8c5a-436b-8be0-b2077af74858']
[ 1534.454639] env[68617]: ERROR oslo.service.loopingcall
[ 1534.454639] env[68617]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred:
[ 1534.454639] env[68617]: ERROR oslo.service.loopingcall
[ 1534.454639] env[68617]: ERROR oslo.service.loopingcall Traceback (most recent call last):
[ 1534.454639] env[68617]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop
[ 1534.454639] env[68617]: ERROR oslo.service.loopingcall     result = func(*self.args, **self.kw)
[ 1534.455087] env[68617]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func
[ 1534.455087] env[68617]: ERROR oslo.service.loopingcall     result = f(*args, **kwargs)
[ 1534.455087] env[68617]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries
[ 1534.455087] env[68617]: ERROR oslo.service.loopingcall     self._deallocate_network(
[ 1534.455087] env[68617]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network
[ 1534.455087] env[68617]: ERROR oslo.service.loopingcall     self.network_api.deallocate_for_instance(
[ 1534.455087] env[68617]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1534.455087] env[68617]: ERROR oslo.service.loopingcall     data = neutron.list_ports(**search_opts)
[ 1534.455087] env[68617]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1534.455087] env[68617]: ERROR oslo.service.loopingcall     ret = obj(*args, **kwargs)
[ 1534.455087] env[68617]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1534.455087] env[68617]: ERROR oslo.service.loopingcall     return self.list('ports', self.ports_path, retrieve_all,
[ 1534.455087] env[68617]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1534.455087] env[68617]: ERROR oslo.service.loopingcall     ret = obj(*args, **kwargs)
[ 1534.455087] env[68617]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1534.455087] env[68617]: ERROR oslo.service.loopingcall     for r in self._pagination(collection, path, **params):
[ 1534.455087] env[68617]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1534.455087] env[68617]: ERROR oslo.service.loopingcall     res = self.get(path, params=params)
[ 1534.455483] env[68617]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1534.455483] env[68617]: ERROR oslo.service.loopingcall     ret = obj(*args, **kwargs)
[ 1534.455483] env[68617]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1534.455483] env[68617]: ERROR oslo.service.loopingcall     return self.retry_request("GET", action, body=body,
[ 1534.455483] env[68617]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1534.455483] env[68617]: ERROR oslo.service.loopingcall     ret = obj(*args, **kwargs)
[ 1534.455483] env[68617]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1534.455483] env[68617]: ERROR oslo.service.loopingcall     return self.do_request(method, action, body=body,
[ 1534.455483] env[68617]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1534.455483] env[68617]: ERROR oslo.service.loopingcall     ret = obj(*args, **kwargs)
[ 1534.455483] env[68617]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1534.455483] env[68617]: ERROR oslo.service.loopingcall     self._handle_fault_response(status_code, replybody, resp)
[ 1534.455483] env[68617]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper
[ 1534.455483] env[68617]: ERROR oslo.service.loopingcall     raise exception.NeutronAdminCredentialConfigurationInvalid()
[ 1534.455483] env[68617]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1534.455483] env[68617]: ERROR oslo.service.loopingcall
[ 1534.455871] env[68617]: ERROR nova.compute.manager [None req-442ecf20-674c-496c-a6f1-86f1a70c287c tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
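The failed call's qualified name, RetryDecorator.__call__.<locals>._func, identifies oslo.service's retry decorator: its __call__ builds an inner _func and drives it on a DynamicLoopingCall, which is why the failure is reported as a "Dynamic interval looping call" and why the traceback passes through loopingcall.py's _run_loop and _func frames. A standalone sketch of that primitive; the retry counts, sleep times, and exception type are illustrative, not nova's values:

```python
# Sketch of oslo.service's RetryDecorator with made-up retry parameters.
from oslo_service import loopingcall


class TransientError(Exception):
    pass


@loopingcall.RetryDecorator(max_retry_count=3, inc_sleep_time=1,
                            max_sleep_time=5, exceptions=(TransientError,))
def deallocate():
    # Retried with increasing sleeps while it raises a listed exception;
    # once the retry budget is exhausted (or on any unlisted exception,
    # like the Unauthorized above) the error propagates and the looping
    # call logs it as failed.
    raise TransientError()


deallocate()
```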
[ 1534.485307] env[68617]: ERROR nova.compute.manager [None req-442ecf20-674c-496c-a6f1-86f1a70c287c tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1534.485307] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Traceback (most recent call last):
[ 1534.485307] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1534.485307] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     ret = obj(*args, **kwargs)
[ 1534.485307] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1534.485307] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     exception_handler_v20(status_code, error_body)
[ 1534.485307] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1534.485307] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     raise client_exc(message=error_message,
[ 1534.485307] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1534.485307] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Neutron server returns request_ids: ['req-275aab86-8c5a-436b-8be0-b2077af74858']
[ 1534.485307] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]
[ 1534.485825] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] During handling of the above exception, another exception occurred:
[ 1534.485825] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]
[ 1534.485825] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Traceback (most recent call last):
[ 1534.485825] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance
[ 1534.485825] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     self._delete_instance(context, instance, bdms)
[ 1534.485825] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance
[ 1534.485825] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     self._shutdown_instance(context, instance, bdms)
[ 1534.485825] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance
[ 1534.485825] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     self._try_deallocate_network(context, instance, requested_networks)
[ 1534.485825] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network
[ 1534.485825] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     with excutils.save_and_reraise_exception():
[ 1534.485825] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1534.485825] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     self.force_reraise()
[ 1534.486203] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1534.486203] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     raise self.value
[ 1534.486203] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network
[ 1534.486203] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     _deallocate_network_with_retries()
[ 1534.486203] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func
[ 1534.486203] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     return evt.wait()
[ 1534.486203] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1534.486203] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     result = hub.switch()
[ 1534.486203] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1534.486203] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     return self.greenlet.switch()
[ 1534.486203] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop
[ 1534.486203] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     result = func(*self.args, **self.kw)
[ 1534.487564] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func
[ 1534.487564] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     result = f(*args, **kwargs)
[ 1534.487564] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries
[ 1534.487564] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     self._deallocate_network(
[ 1534.487564] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network
[ 1534.487564] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     self.network_api.deallocate_for_instance(
[ 1534.487564] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1534.487564] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     data = neutron.list_ports(**search_opts)
[ 1534.487564] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1534.487564] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     ret = obj(*args, **kwargs)
[ 1534.487564] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1534.487564] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     return self.list('ports', self.ports_path, retrieve_all,
[ 1534.487564] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1534.488244] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     ret = obj(*args, **kwargs)
[ 1534.488244] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1534.488244] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     for r in self._pagination(collection, path, **params):
[ 1534.488244] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1534.488244] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     res = self.get(path, params=params)
[ 1534.488244] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1534.488244] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     ret = obj(*args, **kwargs)
[ 1534.488244] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1534.488244] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     return self.retry_request("GET", action, body=body,
[ 1534.488244] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1534.488244] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     ret = obj(*args, **kwargs)
[ 1534.488244] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1534.488244] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     return self.do_request(method, action, body=body,
[ 1534.488582] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1534.488582] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     ret = obj(*args, **kwargs)
[ 1534.488582] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1534.488582] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     self._handle_fault_response(status_code, replybody, resp)
[ 1534.488582] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]   File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper
[ 1534.488582] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]     raise exception.NeutronAdminCredentialConfigurationInvalid()
[ 1534.488582] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1534.488582] env[68617]: ERROR nova.compute.manager [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe]
[ 1534.521988] env[68617]: DEBUG oslo_concurrency.lockutils [None req-442ecf20-674c-496c-a6f1-86f1a70c287c tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Lock "79c92a1b-20ef-4360-93b4-913cbfcf92fe" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.241s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1534.525309] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "79c92a1b-20ef-4360-93b4-913cbfcf92fe" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 244.984s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1534.525499] env[68617]: INFO nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1534.525752] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "79c92a1b-20ef-4360-93b4-913cbfcf92fe" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1534.569422] env[68617]: INFO nova.compute.manager [None req-442ecf20-674c-496c-a6f1-86f1a70c287c tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] [instance: 79c92a1b-20ef-4360-93b4-913cbfcf92fe] Successfully reverted task state from None on failure for instance.
[ 1534.575298] env[68617]: ERROR oslo_messaging.rpc.server [None req-442ecf20-674c-496c-a6f1-86f1a70c287c tempest-ListImageFiltersTestJSON-1602691307 tempest-ListImageFiltersTestJSON-1602691307-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1534.575298] env[68617]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
[ 1534.575298] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1534.575298] env[68617]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1534.575298] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1534.575298] env[68617]: ERROR oslo_messaging.rpc.server     exception_handler_v20(status_code, error_body)
[ 1534.575298] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1534.575298] env[68617]: ERROR oslo_messaging.rpc.server     raise client_exc(message=error_message,
[ 1534.575298] env[68617]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1534.575298] env[68617]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-275aab86-8c5a-436b-8be0-b2077af74858']
[ 1534.575298] env[68617]: ERROR oslo_messaging.rpc.server
[ 1534.575298] env[68617]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred:
[ 1534.575298] env[68617]: ERROR oslo_messaging.rpc.server
[ 1534.575298] env[68617]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
[ 1534.575298] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
[ 1534.575783] env[68617]: ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
[ 1534.575783] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
[ 1534.575783] env[68617]: ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
[ 1534.575783] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
[ 1534.575783] env[68617]: ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
[ 1534.575783] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped
[ 1534.575783] env[68617]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1534.575783] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1534.575783] env[68617]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1534.575783] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1534.575783] env[68617]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1534.575783] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped
[ 1534.575783] env[68617]: ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
[ 1534.575783] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function
[ 1534.575783] env[68617]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1534.575783] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1534.575783] env[68617]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1534.575783] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1534.576272] env[68617]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1534.576272] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function
[ 1534.576272] env[68617]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
[ 1534.576272] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function
[ 1534.576272] env[68617]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
[ 1534.576272] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function
[ 1534.576272] env[68617]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1534.576272] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1534.576272] env[68617]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1534.576272] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1534.576272] env[68617]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1534.576272] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function
[ 1534.576272] env[68617]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
[ 1534.576272] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance
[ 1534.576272] env[68617]: ERROR oslo_messaging.rpc.server     do_terminate_instance(instance, bdms)
[ 1534.576272] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 1534.576272] env[68617]: ERROR oslo_messaging.rpc.server     return f(*args, **kwargs)
[ 1534.576272] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance
[ 1534.576696] env[68617]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1534.576696] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1534.576696] env[68617]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1534.576696] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1534.576696] env[68617]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1534.576696] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance
[ 1534.576696] env[68617]: ERROR oslo_messaging.rpc.server     self._delete_instance(context, instance, bdms)
[ 1534.576696] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance
[ 1534.576696] env[68617]: ERROR oslo_messaging.rpc.server     self._shutdown_instance(context, instance, bdms)
[ 1534.576696] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance
[ 1534.576696] env[68617]: ERROR oslo_messaging.rpc.server     self._try_deallocate_network(context, instance, requested_networks)
[ 1534.576696] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network
[ 1534.576696] env[68617]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1534.576696] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1534.576696] env[68617]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1534.576696] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1534.576696] env[68617]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1534.576696] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network
[ 1534.577137] env[68617]: ERROR oslo_messaging.rpc.server     _deallocate_network_with_retries()
[ 1534.577137] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func
[ 1534.577137] env[68617]: ERROR oslo_messaging.rpc.server     return evt.wait()
[ 1534.577137] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1534.577137] env[68617]: ERROR oslo_messaging.rpc.server     result = hub.switch()
[ 1534.577137] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1534.577137] env[68617]: ERROR oslo_messaging.rpc.server     return self.greenlet.switch()
[ 1534.577137] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop
[ 1534.577137] env[68617]: ERROR oslo_messaging.rpc.server     result = func(*self.args, **self.kw)
[ 1534.577137] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func
[ 1534.577137] env[68617]: ERROR oslo_messaging.rpc.server     result = f(*args, **kwargs)
[ 1534.577137] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries
[ 1534.577137] env[68617]: ERROR oslo_messaging.rpc.server     self._deallocate_network(
[ 1534.577137] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network
[ 1534.577137] env[68617]: ERROR oslo_messaging.rpc.server     self.network_api.deallocate_for_instance(
[ 1534.577137] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1534.577137] env[68617]: ERROR oslo_messaging.rpc.server     data = neutron.list_ports(**search_opts)
[ 1534.577137] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1534.577589] env[68617]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1534.577589] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1534.577589] env[68617]: ERROR oslo_messaging.rpc.server     return self.list('ports', self.ports_path, retrieve_all,
[ 1534.577589] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1534.577589] env[68617]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1534.577589] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1534.577589] env[68617]: ERROR oslo_messaging.rpc.server     for r in self._pagination(collection, path, **params):
[ 1534.577589] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1534.577589] env[68617]: ERROR oslo_messaging.rpc.server     res = self.get(path, params=params)
[ 1534.577589] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1534.577589] env[68617]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1534.577589] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1534.577589] env[68617]: ERROR oslo_messaging.rpc.server     return self.retry_request("GET", action, body=body,
[ 1534.577589] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1534.577589] env[68617]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1534.577589] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1534.577589] env[68617]: ERROR oslo_messaging.rpc.server     return self.do_request(method, action, body=body,
[ 1534.577589] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1534.577995] env[68617]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1534.577995] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1534.577995] env[68617]: ERROR oslo_messaging.rpc.server     self._handle_fault_response(status_code, replybody, resp)
[ 1534.577995] env[68617]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper
[ 1534.577995] env[68617]: ERROR oslo_messaging.rpc.server     raise exception.NeutronAdminCredentialConfigurationInvalid()
[ 1534.577995] env[68617]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1534.577995] env[68617]: ERROR oslo_messaging.rpc.server
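All three tracebacks above bottom out in the same 401 from Keystone, and the first ERROR line points at the [neutron] service credentials in nova.conf. One way to exercise those credentials in isolation is to build the same kind of keystoneauth session by hand and repeat the failing list_ports call; every value below is a placeholder, not this deployment's configuration:

```python
# Hedged sketch: check nova's [neutron] service credentials outside nova.
# All connection values are placeholders for the real nova.conf settings.
from keystoneauth1.identity import v3
from keystoneauth1 import session
from neutronclient.v2_0 import client as neutron_client

auth = v3.Password(
    auth_url='http://controller/identity/v3',  # [neutron]/auth_url
    username='neutron',                        # [neutron]/username
    password='REDACTED',                       # [neutron]/password
    project_name='service',                    # [neutron]/project_name
    user_domain_name='Default',
    project_domain_name='Default',
)
sess = session.Session(auth=auth)
sess.get_token()  # raises an Unauthorized error if the credentials are bad

# Mirrors the failing frame "data = neutron.list_ports(**search_opts)";
# the device_id filter is the instance UUID from the log above.
neutron = neutron_client.Client(session=sess)
neutron.list_ports(device_id='79c92a1b-20ef-4360-93b4-913cbfcf92fe')
```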
[ 1534.627818] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35a0f0ef-60e0-46a3-8cd9-66d9f3e82fe9 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1534.636276] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd32ddb3-7730-4b03-b96a-945cc87e8f95 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1534.667439] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f2bf2d4-1f39-4a46-b072-9694b6a37b7a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1534.674761] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c728f93d-98f2-4120-9c85-693fae7aa4b3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1534.688054] env[68617]: DEBUG nova.compute.provider_tree [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1534.718410] env[68617]: DEBUG nova.scheduler.client.report [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1534.732438] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.377s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1534.732924] env[68617]: DEBUG nova.compute.manager [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 1534.772747] env[68617]: DEBUG nova.compute.utils [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1534.774188] env[68617]: DEBUG nova.compute.manager [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1534.774375] env[68617]: DEBUG nova.network.neutron [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1534.790299] env[68617]: DEBUG nova.compute.manager [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1534.845166] env[68617]: DEBUG nova.policy [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5eedbc9d986e444d88090b25dd4499e9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bc7486cad82a4c5e8f23a4012ee166d2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1534.858645] env[68617]: DEBUG nova.compute.manager [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Start spawning the instance on the hypervisor. {{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 1534.890042] env[68617]: DEBUG nova.virt.hardware [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=<?>,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-17T05:31:26Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1534.890335] env[68617]: DEBUG nova.virt.hardware [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1534.890495] env[68617]: DEBUG nova.virt.hardware [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1534.890669] env[68617]: DEBUG nova.virt.hardware [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1534.890816] env[68617]: DEBUG nova.virt.hardware [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1534.890958] env[68617]: DEBUG nova.virt.hardware [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1534.891183] env[68617]: DEBUG nova.virt.hardware [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1534.891340] env[68617]: DEBUG nova.virt.hardware [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1534.891502] env[68617]: DEBUG nova.virt.hardware [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1534.891661] env[68617]: DEBUG nova.virt.hardware [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1534.891830] env[68617]: DEBUG nova.virt.hardware [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1534.892762] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbc7209f-b403-4ae3-84b9-7e74a025008a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1534.901888] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3dad1faa-bab1-44f2-abff-a52332ca6ba5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1535.148600] env[68617]: DEBUG nova.network.neutron [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Successfully created port: 117bd600-fd08-4314-95d1-1e37a2864bc8 {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1535.651183] env[68617]: DEBUG nova.compute.manager [req-72a94a58-8a12-405a-a73f-ffdbf12e75a1 req-09c72b50-6a50-4790-b611-f3af9a882a3c service nova] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Received event network-vif-plugged-117bd600-fd08-4314-95d1-1e37a2864bc8 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1535.651183] env[68617]: DEBUG oslo_concurrency.lockutils [req-72a94a58-8a12-405a-a73f-ffdbf12e75a1 req-09c72b50-6a50-4790-b611-f3af9a882a3c service nova] Acquiring lock "6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1535.651183] env[68617]: DEBUG oslo_concurrency.lockutils [req-72a94a58-8a12-405a-a73f-ffdbf12e75a1 req-09c72b50-6a50-4790-b611-f3af9a882a3c service nova] Lock "6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1535.651183] env[68617]: DEBUG oslo_concurrency.lockutils [req-72a94a58-8a12-405a-a73f-ffdbf12e75a1 req-09c72b50-6a50-4790-b611-f3af9a882a3c service nova] Lock "6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1535.653145] env[68617]: DEBUG nova.compute.manager [req-72a94a58-8a12-405a-a73f-ffdbf12e75a1 req-09c72b50-6a50-4790-b611-f3af9a882a3c service nova] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] No waiting events found dispatching network-vif-plugged-117bd600-fd08-4314-95d1-1e37a2864bc8 {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1535.653145] env[68617]: WARNING nova.compute.manager [req-72a94a58-8a12-405a-a73f-ffdbf12e75a1 req-09c72b50-6a50-4790-b611-f3af9a882a3c service nova] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Received unexpected event network-vif-plugged-117bd600-fd08-4314-95d1-1e37a2864bc8 for instance with vm_state building and task_state spawning.
[ 1535.765181] env[68617]: DEBUG nova.network.neutron [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Successfully updated port: 117bd600-fd08-4314-95d1-1e37a2864bc8 {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1535.778160] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Acquiring lock "refresh_cache-6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1535.778388] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Acquired lock "refresh_cache-6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1535.778622] env[68617]: DEBUG nova.network.neutron [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1535.819551] env[68617]: DEBUG nova.network.neutron [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1536.001515] env[68617]: DEBUG nova.network.neutron [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Updating instance_info_cache with network_info: [{"id": "117bd600-fd08-4314-95d1-1e37a2864bc8", "address": "fa:16:3e:c8:39:66", "network": {"id": "afe65fda-c28a-44a6-a704-1019593fdab8", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1769052523-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bc7486cad82a4c5e8f23a4012ee166d2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dced2f3d-7fd3-4a42-836d-9f02dab4c949", "external-id": "nsx-vlan-transportzone-117", "segmentation_id": 117, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap117bd600-fd", "ovs_interfaceid": "117bd600-fd08-4314-95d1-1e37a2864bc8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1536.017178] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Releasing lock "refresh_cache-6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1536.017559] env[68617]: DEBUG nova.compute.manager [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Instance network_info: |[{"id": "117bd600-fd08-4314-95d1-1e37a2864bc8", "address": "fa:16:3e:c8:39:66", "network": {"id": "afe65fda-c28a-44a6-a704-1019593fdab8", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1769052523-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bc7486cad82a4c5e8f23a4012ee166d2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dced2f3d-7fd3-4a42-836d-9f02dab4c949", "external-id": "nsx-vlan-transportzone-117", "segmentation_id": 117, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap117bd600-fd", "ovs_interfaceid": "117bd600-fd08-4314-95d1-1e37a2864bc8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1536.017985] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c8:39:66', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'dced2f3d-7fd3-4a42-836d-9f02dab4c949', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '117bd600-fd08-4314-95d1-1e37a2864bc8', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1536.025642] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Creating folder: Project (bc7486cad82a4c5e8f23a4012ee166d2). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1536.026223] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9f2b2d9c-6843-452e-80ef-52fef2836c6c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1536.036969] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Created folder: Project (bc7486cad82a4c5e8f23a4012ee166d2) in parent group-v693691.
[ 1536.037213] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Creating folder: Instances. Parent ref: group-v693774. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1536.037516] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d1f25334-d063-48dd-a241-0b4319988408 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1536.045096] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Created folder: Instances in parent group-v693774.
[ 1536.045381] env[68617]: DEBUG oslo.service.loopingcall [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1536.045519] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1536.045733] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-201f2315-03b0-4a64-877d-a4cec30e5575 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1536.064629] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1536.064629] env[68617]: value = "task-3470837"
[ 1536.064629] env[68617]: _type = "Task"
[ 1536.064629] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1536.073680] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470837, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1536.574907] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470837, 'name': CreateVM_Task, 'duration_secs': 0.289458} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1536.575171] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1536.575942] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1536.576118] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1536.576454] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1536.576696] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-85c164b5-9aca-485c-98e2-d8e8f5750a17 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1536.581147] env[68617]: DEBUG oslo_vmware.api [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Waiting for the task: (returnval){
[ 1536.581147] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52c92691-f0cd-0cdd-70b7-cfea51b57a70"
[ 1536.581147] env[68617]: _type = "Task"
[ 1536.581147] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
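The CreateVM_Task block above is oslo.vmware's task-polling loop: invoke_api submits a vSphere task and returns its managed object reference, and wait_for_task polls it (the "progress is 0%." lines) until it completes. A condensed sketch of that pattern against the same library; the connection values are placeholders, and the folder, config spec, and resource pool references are assumed to come from elsewhere:

```python
# Sketch of oslo.vmware task polling; host and credentials are placeholders.
from oslo_vmware import api


def create_vm(folder_ref, config_spec, respool_ref):
    # In nova these values come from nova.conf's [vmware] section.
    session = api.VMwareAPISession(
        'vcenter.example.org', 'user', 'REDACTED',
        api_retry_count=10, task_poll_interval=0.5)
    # invoke_api returns a Task moref; wait_for_task polls its TaskInfo
    # and returns the completed info (whose result is the new VM's moref)
    # or raises if the task ends in an error state.
    task = session.invoke_api(session.vim, 'CreateVM_Task', folder_ref,
                              config=config_spec, pool=respool_ref)
    return session.wait_for_task(task)
```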
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1536.588537] env[68617]: DEBUG oslo_vmware.api [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52c92691-f0cd-0cdd-70b7-cfea51b57a70, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1537.098146] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1537.098407] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1537.098613] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1537.661415] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d2d19b3a-6461-4472-b37a-867a0762c7fc tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Acquiring lock "6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1537.734684] env[68617]: DEBUG nova.compute.manager [req-4da7ddd6-ca55-42a9-b005-751cc01da2d0 req-a77fb930-3d86-4d68-8057-cb1dfd4f532a service nova] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Received event network-changed-117bd600-fd08-4314-95d1-1e37a2864bc8 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1537.734684] env[68617]: DEBUG nova.compute.manager [req-4da7ddd6-ca55-42a9-b005-751cc01da2d0 req-a77fb930-3d86-4d68-8057-cb1dfd4f532a service nova] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Refreshing instance network info cache due to event network-changed-117bd600-fd08-4314-95d1-1e37a2864bc8. 
{{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1537.734684] env[68617]: DEBUG oslo_concurrency.lockutils [req-4da7ddd6-ca55-42a9-b005-751cc01da2d0 req-a77fb930-3d86-4d68-8057-cb1dfd4f532a service nova] Acquiring lock "refresh_cache-6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1537.734684] env[68617]: DEBUG oslo_concurrency.lockutils [req-4da7ddd6-ca55-42a9-b005-751cc01da2d0 req-a77fb930-3d86-4d68-8057-cb1dfd4f532a service nova] Acquired lock "refresh_cache-6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1537.734684] env[68617]: DEBUG nova.network.neutron [req-4da7ddd6-ca55-42a9-b005-751cc01da2d0 req-a77fb930-3d86-4d68-8057-cb1dfd4f532a service nova] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Refreshing network info cache for port 117bd600-fd08-4314-95d1-1e37a2864bc8 {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1538.077925] env[68617]: DEBUG nova.network.neutron [req-4da7ddd6-ca55-42a9-b005-751cc01da2d0 req-a77fb930-3d86-4d68-8057-cb1dfd4f532a service nova] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Updated VIF entry in instance network info cache for port 117bd600-fd08-4314-95d1-1e37a2864bc8. {{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1538.078890] env[68617]: DEBUG nova.network.neutron [req-4da7ddd6-ca55-42a9-b005-751cc01da2d0 req-a77fb930-3d86-4d68-8057-cb1dfd4f532a service nova] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Updating instance_info_cache with network_info: [{"id": "117bd600-fd08-4314-95d1-1e37a2864bc8", "address": "fa:16:3e:c8:39:66", "network": {"id": "afe65fda-c28a-44a6-a704-1019593fdab8", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1769052523-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bc7486cad82a4c5e8f23a4012ee166d2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dced2f3d-7fd3-4a42-836d-9f02dab4c949", "external-id": "nsx-vlan-transportzone-117", "segmentation_id": 117, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap117bd600-fd", "ovs_interfaceid": "117bd600-fd08-4314-95d1-1e37a2864bc8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1538.088399] env[68617]: DEBUG oslo_concurrency.lockutils [req-4da7ddd6-ca55-42a9-b005-751cc01da2d0 req-a77fb930-3d86-4d68-8057-cb1dfd4f532a service nova] Releasing lock "refresh_cache-6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1539.319890] env[68617]: DEBUG oslo_concurrency.lockutils [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] 
Acquiring lock "f54002b0-d60e-44ff-82a5-ef2f5193c48c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1539.320201] env[68617]: DEBUG oslo_concurrency.lockutils [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "f54002b0-d60e-44ff-82a5-ef2f5193c48c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1542.240050] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Acquiring lock "2c950cba-7698-48e0-8852-bf569f58f967" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1542.240720] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Lock "2c950cba-7698-48e0-8852-bf569f58f967" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1543.699029] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1543.699367] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Cleaning up deleted instances {{(pid=68617) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1543.710505] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] There are 0 instances to clean {{(pid=68617) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1549.117548] env[68617]: DEBUG oslo_concurrency.lockutils [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquiring lock "12ed2a40-3d74-49a2-95b4-ccaaf58c8060" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1549.117838] env[68617]: DEBUG oslo_concurrency.lockutils [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Lock "12ed2a40-3d74-49a2-95b4-ccaaf58c8060" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1549.698997] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1549.698997] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1549.698997] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1549.720260] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1549.720260] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1549.720260] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1549.720260] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1549.720446] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1549.720446] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1549.720564] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1549.720676] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1549.720786] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Skipping network cache update for instance because it is Building. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1549.722022] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1549.722022] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1549.722022] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1550.710448] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1550.710448] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Cleaning up deleted instances with incomplete migration {{(pid=68617) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1552.710510] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1552.710787] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1552.710914] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1554.700283] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1554.700676] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1554.700676] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1557.694763] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1557.700297] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1557.700297] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1557.718043] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1557.718043] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1557.718043] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1557.718043] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1557.718686] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-488d3c1f-8181-4f97-9e2f-5ed3b1bffe92 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1557.737421] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6f50ed6-4183-49c3-a053-cfe4d37de6f7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1557.750576] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-549d8acb-fd9a-4fc6-931d-a62135424f21 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1557.758117] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3cb4a4e0-aee2-409e-b791-0d16a18fd337 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1557.792891] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None 
None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180925MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1557.793475] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1557.793809] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1557.958324] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 1cc42c7f-8781-40b0-9f75-edfef3bc90e7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1557.958552] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance d46ca6f3-0ee9-412c-98b4-f639ce4f9228 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1557.959187] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a8ff6232-530c-453a-96e4-f8ce00f976e3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1557.959187] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1557.959187] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e90877a8-47d3-47d7-8362-5bcfe3a98c36 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1557.959187] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f03b9bc5-9438-4c0c-b595-72c631bece08 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1557.959355] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance ee6efd93-25be-4268-afe9-ba39e543a4fb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1557.959355] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 1605028f-5d6d-4ac4-8416-c0465982c53a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1557.959463] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance fc1043b8-535d-4af0-b92b-1f43580cdc9a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1557.959602] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1557.976255] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 57cdcf44-576a-4343-9277-4b9ebb2b194a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1557.994337] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f54002b0-d60e-44ff-82a5-ef2f5193c48c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1558.005079] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 2c950cba-7698-48e0-8852-bf569f58f967 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1558.019856] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 12ed2a40-3d74-49a2-95b4-ccaaf58c8060 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1558.019856] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1558.019856] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1558.036676] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Refreshing inventories for resource provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1558.053328] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Updating ProviderTree inventory for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1558.053527] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Updating inventory in ProviderTree for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1558.067924] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Refreshing aggregate associations for resource provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f, aggregates: None {{(pid=68617) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1558.089075] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Refreshing trait associations for resource provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_VMDK {{(pid=68617) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1558.286690] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Acquiring lock "21d0560a-fde3-4c16-b2fc-06d6f8668a7a" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1558.287067] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Lock "21d0560a-fde3-4c16-b2fc-06d6f8668a7a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1558.310585] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59e3f0d8-b9da-450a-85b5-455838c81496 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1558.323015] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dff402ea-0f5b-4375-8cfd-da4113b3a2d7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1558.360603] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b9aebd6-ccde-406f-ba00-41b9c5f90799 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1558.369804] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55aa183b-8dd6-4c8a-9ee9-f31fec38121f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1558.386424] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1558.396776] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1558.411564] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1558.411564] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.617s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1562.238895] env[68617]: DEBUG oslo_concurrency.lockutils [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 
tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Acquiring lock "902b5ab9-23b8-450f-853a-b2da889c3afd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1562.239204] env[68617]: DEBUG oslo_concurrency.lockutils [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Lock "902b5ab9-23b8-450f-853a-b2da889c3afd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1579.924354] env[68617]: WARNING oslo_vmware.rw_handles [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1579.924354] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1579.924354] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1579.924354] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1579.924354] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1579.924354] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 1579.924354] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1579.924354] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1579.924354] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1579.924354] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1579.924354] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1579.924354] env[68617]: ERROR oslo_vmware.rw_handles [ 1579.924956] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/215b96e7-b230-446f-88de-4ce754d9b571/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1579.926899] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1579.927162] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Copying Virtual Disk [datastore2] 
vmware_temp/215b96e7-b230-446f-88de-4ce754d9b571/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/215b96e7-b230-446f-88de-4ce754d9b571/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1579.927452] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3b6a0c5b-2bda-4085-9882-17b30d38be68 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1579.935981] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Waiting for the task: (returnval){ [ 1579.935981] env[68617]: value = "task-3470838" [ 1579.935981] env[68617]: _type = "Task" [ 1579.935981] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1579.943976] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Task: {'id': task-3470838, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1580.447779] env[68617]: DEBUG oslo_vmware.exceptions [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Fault InvalidArgument not matched. {{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1580.448084] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1580.448634] env[68617]: ERROR nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1580.448634] env[68617]: Faults: ['InvalidArgument'] [ 1580.448634] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Traceback (most recent call last): [ 1580.448634] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1580.448634] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] yield resources [ 1580.448634] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1580.448634] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] self.driver.spawn(context, instance, image_meta, [ 1580.448634] env[68617]: ERROR nova.compute.manager [instance: 
1cc42c7f-8781-40b0-9f75-edfef3bc90e7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1580.448634] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1580.448634] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1580.448634] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] self._fetch_image_if_missing(context, vi) [ 1580.448634] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1580.448959] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] image_cache(vi, tmp_image_ds_loc) [ 1580.448959] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1580.448959] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] vm_util.copy_virtual_disk( [ 1580.448959] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1580.448959] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] session._wait_for_task(vmdk_copy_task) [ 1580.448959] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1580.448959] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] return self.wait_for_task(task_ref) [ 1580.448959] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1580.448959] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] return evt.wait() [ 1580.448959] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1580.448959] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] result = hub.switch() [ 1580.448959] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1580.448959] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] return self.greenlet.switch() [ 1580.449424] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1580.449424] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] self.f(*self.args, **self.kw) [ 1580.449424] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1580.449424] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] raise 
exceptions.translate_fault(task_info.error) [ 1580.449424] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1580.449424] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Faults: ['InvalidArgument'] [ 1580.449424] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] [ 1580.449424] env[68617]: INFO nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Terminating instance [ 1580.450516] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1580.450725] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1580.450959] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-01ca8097-da2b-4251-8c09-eefd647b94ee {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1580.453087] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Start destroying the instance on the hypervisor. 
{{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1580.453284] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1580.453987] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd161801-9419-4c5a-a106-a2572e6f7b69 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1580.460840] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1580.461066] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-bcfed32e-b181-48f4-a681-27d8d971515b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1580.463165] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1580.463335] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1580.464289] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fba7fb3f-8499-4a38-a09a-8b19b0480780 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1580.469129] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Waiting for the task: (returnval){ [ 1580.469129] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]5220e1d5-acac-9e09-f5e4-bcf790bd2191" [ 1580.469129] env[68617]: _type = "Task" [ 1580.469129] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1580.480356] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]5220e1d5-acac-9e09-f5e4-bcf790bd2191, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1580.531498] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1580.531751] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1580.532038] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Deleting the datastore file [datastore2] 1cc42c7f-8781-40b0-9f75-edfef3bc90e7 {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1580.532308] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-05943f1a-ff5c-4e58-82d8-e34830e4eccc {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1580.538930] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Waiting for the task: (returnval){ [ 1580.538930] env[68617]: value = "task-3470840" [ 1580.538930] env[68617]: _type = "Task" [ 1580.538930] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1580.546963] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Task: {'id': task-3470840, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1580.980642] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1580.980642] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Creating directory with path [datastore2] vmware_temp/a4bb14cf-506e-4695-a81a-e038c4195f15/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1580.981213] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fa957ccf-e343-4968-b515-c42878f53618 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1580.991500] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Created directory with path [datastore2] vmware_temp/a4bb14cf-506e-4695-a81a-e038c4195f15/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1580.991697] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Fetch image to [datastore2] vmware_temp/a4bb14cf-506e-4695-a81a-e038c4195f15/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1580.991899] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/a4bb14cf-506e-4695-a81a-e038c4195f15/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1580.992642] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6672b3fa-936a-4b55-bd2e-ca10fb12028f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1580.999156] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a57f955d-99c5-4e30-8a70-36aa5ea643bd {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1581.008118] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f6332bf-e0a5-45de-8f20-a016a443d5c9 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1581.039599] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-aea7837b-7927-4015-ba43-6fc353d8605a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1581.050598] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1ade5679-9bc2-4950-906b-fdd4bbea0874 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1581.052291] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Task: {'id': task-3470840, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078657} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1581.052525] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1581.052705] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1581.052892] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1581.053104] env[68617]: INFO nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1581.055250] env[68617]: DEBUG nova.compute.claims [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1581.055440] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1581.055650] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1581.078146] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1581.138767] env[68617]: DEBUG oslo_vmware.rw_handles [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a4bb14cf-506e-4695-a81a-e038c4195f15/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1581.197555] env[68617]: DEBUG oslo_vmware.rw_handles [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1581.197749] env[68617]: DEBUG oslo_vmware.rw_handles [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a4bb14cf-506e-4695-a81a-e038c4195f15/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1581.321812] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d2694fe-1d5d-4948-8c50-40899bb4abca {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1581.328813] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83f56bf3-4305-4067-9efc-530159c8aa5a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1581.357699] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b3bcc28-7fed-4477-b59a-567f9b4ae961 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1581.365016] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7558f699-c08e-472e-acae-03e89c19c65c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1581.378256] env[68617]: DEBUG nova.compute.provider_tree [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1581.386820] env[68617]: DEBUG nova.scheduler.client.report [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1581.403150] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.347s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1581.403729] env[68617]: ERROR nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1581.403729] env[68617]: Faults: ['InvalidArgument'] [ 1581.403729] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Traceback (most recent call last): [ 1581.403729] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance 
[ 1581.403729] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] self.driver.spawn(context, instance, image_meta, [ 1581.403729] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1581.403729] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1581.403729] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1581.403729] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] self._fetch_image_if_missing(context, vi) [ 1581.403729] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1581.403729] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] image_cache(vi, tmp_image_ds_loc) [ 1581.403729] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1581.404075] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] vm_util.copy_virtual_disk( [ 1581.404075] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1581.404075] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] session._wait_for_task(vmdk_copy_task) [ 1581.404075] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1581.404075] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] return self.wait_for_task(task_ref) [ 1581.404075] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1581.404075] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] return evt.wait() [ 1581.404075] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1581.404075] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] result = hub.switch() [ 1581.404075] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1581.404075] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] return self.greenlet.switch() [ 1581.404075] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1581.404075] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] self.f(*self.args, **self.kw) [ 1581.404387] env[68617]: ERROR nova.compute.manager [instance: 
1cc42c7f-8781-40b0-9f75-edfef3bc90e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1581.404387] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] raise exceptions.translate_fault(task_info.error) [ 1581.404387] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1581.404387] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Faults: ['InvalidArgument'] [ 1581.404387] env[68617]: ERROR nova.compute.manager [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] [ 1581.404505] env[68617]: DEBUG nova.compute.utils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1581.406018] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Build of instance 1cc42c7f-8781-40b0-9f75-edfef3bc90e7 was re-scheduled: A specified parameter was not correct: fileType [ 1581.406018] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1581.406397] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1581.406570] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1581.406738] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1581.406913] env[68617]: DEBUG nova.network.neutron [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1581.715199] env[68617]: DEBUG nova.network.neutron [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1581.729486] env[68617]: INFO nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Took 0.32 seconds to deallocate network for instance. [ 1581.833261] env[68617]: INFO nova.scheduler.client.report [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Deleted allocations for instance 1cc42c7f-8781-40b0-9f75-edfef3bc90e7 [ 1581.852470] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "1cc42c7f-8781-40b0-9f75-edfef3bc90e7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 484.395s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1581.853584] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "1cc42c7f-8781-40b0-9f75-edfef3bc90e7" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 292.312s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1581.853770] env[68617]: INFO nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] During sync_power_state the instance has a pending task (spawning). Skip. 
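The Acquiring lock / acquired :: waited / released :: held lines throughout this section come from oslo.concurrency's lockutils, which wraps critical sections in named in-process locks and logs how long each caller waited for and held them. A minimal sketch of the two forms visible here, with placeholder bodies (the lock names mirror the log; the functions are illustrative, not Nova's real resource tracker):

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def abort_instance_claim():
        # Runs with the "compute_resources" lock held; the decorator's inner
        # wrapper emits the acquire/release DEBUG lines with timings.
        pass

    # The same helper is available as a context manager for ad-hoc names,
    # e.g. the per-instance "<uuid>-events" locks seen above.
    with lockutils.lock('1cc42c7f-8781-40b0-9f75-edfef3bc90e7-events'):
        pass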
[ 1581.853942] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "1cc42c7f-8781-40b0-9f75-edfef3bc90e7" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1581.854783] env[68617]: DEBUG oslo_concurrency.lockutils [None req-80ebfc67-2231-4d45-83cd-8491204e5755 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "1cc42c7f-8781-40b0-9f75-edfef3bc90e7" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 287.765s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1581.854881] env[68617]: DEBUG oslo_concurrency.lockutils [None req-80ebfc67-2231-4d45-83cd-8491204e5755 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquiring lock "1cc42c7f-8781-40b0-9f75-edfef3bc90e7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1581.855062] env[68617]: DEBUG oslo_concurrency.lockutils [None req-80ebfc67-2231-4d45-83cd-8491204e5755 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "1cc42c7f-8781-40b0-9f75-edfef3bc90e7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1581.855239] env[68617]: DEBUG oslo_concurrency.lockutils [None req-80ebfc67-2231-4d45-83cd-8491204e5755 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "1cc42c7f-8781-40b0-9f75-edfef3bc90e7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1581.857400] env[68617]: INFO nova.compute.manager [None req-80ebfc67-2231-4d45-83cd-8491204e5755 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Terminating instance [ 1581.858872] env[68617]: DEBUG nova.compute.manager [None req-80ebfc67-2231-4d45-83cd-8491204e5755 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Start destroying the instance on the hypervisor. 
{{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1581.859746] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-80ebfc67-2231-4d45-83cd-8491204e5755 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1581.859746] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1308853a-0d36-4d8c-b427-8fb94ccdbd09 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1581.864779] env[68617]: DEBUG nova.compute.manager [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: a019d654-82ed-4ef2-850f-39a1f324566a] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1581.871334] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c41996a6-a07d-47c7-8ba5-bdfa0273f5aa {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1581.900074] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-80ebfc67-2231-4d45-83cd-8491204e5755 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 1cc42c7f-8781-40b0-9f75-edfef3bc90e7 could not be found. [ 1581.900279] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-80ebfc67-2231-4d45-83cd-8491204e5755 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1581.900451] env[68617]: INFO nova.compute.manager [None req-80ebfc67-2231-4d45-83cd-8491204e5755 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1581.900686] env[68617]: DEBUG oslo.service.loopingcall [None req-80ebfc67-2231-4d45-83cd-8491204e5755 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1581.901087] env[68617]: DEBUG nova.compute.manager [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: a019d654-82ed-4ef2-850f-39a1f324566a] Instance disappeared before build. 
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1581.901903] env[68617]: DEBUG nova.compute.manager [-] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1581.902016] env[68617]: DEBUG nova.network.neutron [-] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1581.921065] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Lock "a019d654-82ed-4ef2-850f-39a1f324566a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 240.628s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1581.925827] env[68617]: DEBUG nova.network.neutron [-] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1581.929493] env[68617]: DEBUG nova.compute.manager [None req-15b7b7d0-4e37-4a51-8b47-8f5c30bc73d8 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 43495abf-8f99-4f51-81ca-80a43c266695] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1581.935957] env[68617]: INFO nova.compute.manager [-] [instance: 1cc42c7f-8781-40b0-9f75-edfef3bc90e7] Took 0.03 seconds to deallocate network for instance. [ 1581.954667] env[68617]: DEBUG nova.compute.manager [None req-15b7b7d0-4e37-4a51-8b47-8f5c30bc73d8 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 43495abf-8f99-4f51-81ca-80a43c266695] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1581.979704] env[68617]: DEBUG oslo_concurrency.lockutils [None req-15b7b7d0-4e37-4a51-8b47-8f5c30bc73d8 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "43495abf-8f99-4f51-81ca-80a43c266695" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 239.510s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1581.988818] env[68617]: DEBUG nova.compute.manager [None req-bd9e7f4b-8a51-4064-946a-00dbae218b70 tempest-AttachVolumeTestJSON-339037198 tempest-AttachVolumeTestJSON-339037198-project-member] [instance: b9d0b85a-f0ac-4f9e-bec4-a82db0eb96c3] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1582.018230] env[68617]: DEBUG nova.compute.manager [None req-bd9e7f4b-8a51-4064-946a-00dbae218b70 tempest-AttachVolumeTestJSON-339037198 tempest-AttachVolumeTestJSON-339037198-project-member] [instance: b9d0b85a-f0ac-4f9e-bec4-a82db0eb96c3] Instance disappeared before build. 
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1582.030929] env[68617]: DEBUG oslo_concurrency.lockutils [None req-80ebfc67-2231-4d45-83cd-8491204e5755 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "1cc42c7f-8781-40b0-9f75-edfef3bc90e7" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.176s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1582.040570] env[68617]: DEBUG oslo_concurrency.lockutils [None req-bd9e7f4b-8a51-4064-946a-00dbae218b70 tempest-AttachVolumeTestJSON-339037198 tempest-AttachVolumeTestJSON-339037198-project-member] Lock "b9d0b85a-f0ac-4f9e-bec4-a82db0eb96c3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 233.613s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1582.049848] env[68617]: DEBUG nova.compute.manager [None req-6be4c7fd-8e1f-4c7f-83a2-e9f158814247 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 5d294d66-266f-4a0b-be49-5061fb65b226] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1582.071681] env[68617]: DEBUG nova.compute.manager [None req-6be4c7fd-8e1f-4c7f-83a2-e9f158814247 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 5d294d66-266f-4a0b-be49-5061fb65b226] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1582.091160] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6be4c7fd-8e1f-4c7f-83a2-e9f158814247 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Lock "5d294d66-266f-4a0b-be49-5061fb65b226" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 229.604s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1582.101104] env[68617]: DEBUG nova.compute.manager [None req-18863ad0-b310-42ab-b5a2-bab78170f76b tempest-ImagesTestJSON-918330909 tempest-ImagesTestJSON-918330909-project-member] [instance: f8c0a514-7e7f-455a-b84d-9afc2957945c] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1582.124124] env[68617]: DEBUG nova.compute.manager [None req-18863ad0-b310-42ab-b5a2-bab78170f76b tempest-ImagesTestJSON-918330909 tempest-ImagesTestJSON-918330909-project-member] [instance: f8c0a514-7e7f-455a-b84d-9afc2957945c] Instance disappeared before build. 
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1582.146189] env[68617]: DEBUG oslo_concurrency.lockutils [None req-18863ad0-b310-42ab-b5a2-bab78170f76b tempest-ImagesTestJSON-918330909 tempest-ImagesTestJSON-918330909-project-member] Lock "f8c0a514-7e7f-455a-b84d-9afc2957945c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 228.719s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1582.155685] env[68617]: DEBUG nova.compute.manager [None req-c02501af-c000-48aa-ada4-d670a1fa0355 tempest-ServersV294TestFqdnHostnames-114980253 tempest-ServersV294TestFqdnHostnames-114980253-project-member] [instance: 9ca297f6-3239-48d3-9b67-dd1637a3bc25] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1582.180432] env[68617]: DEBUG nova.compute.manager [None req-c02501af-c000-48aa-ada4-d670a1fa0355 tempest-ServersV294TestFqdnHostnames-114980253 tempest-ServersV294TestFqdnHostnames-114980253-project-member] [instance: 9ca297f6-3239-48d3-9b67-dd1637a3bc25] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1582.201180] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c02501af-c000-48aa-ada4-d670a1fa0355 tempest-ServersV294TestFqdnHostnames-114980253 tempest-ServersV294TestFqdnHostnames-114980253-project-member] Lock "9ca297f6-3239-48d3-9b67-dd1637a3bc25" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 224.012s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1582.211025] env[68617]: DEBUG nova.compute.manager [None req-5a02a231-4f41-46be-a700-c796ffc4183b tempest-ServersNegativeTestMultiTenantJSON-1012065245 tempest-ServersNegativeTestMultiTenantJSON-1012065245-project-member] [instance: 57cdcf44-576a-4343-9277-4b9ebb2b194a] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1582.237019] env[68617]: DEBUG nova.compute.manager [None req-5a02a231-4f41-46be-a700-c796ffc4183b tempest-ServersNegativeTestMultiTenantJSON-1012065245 tempest-ServersNegativeTestMultiTenantJSON-1012065245-project-member] [instance: 57cdcf44-576a-4343-9277-4b9ebb2b194a] Instance disappeared before build. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1582.260308] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5a02a231-4f41-46be-a700-c796ffc4183b tempest-ServersNegativeTestMultiTenantJSON-1012065245 tempest-ServersNegativeTestMultiTenantJSON-1012065245-project-member] Lock "57cdcf44-576a-4343-9277-4b9ebb2b194a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 210.616s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1582.270157] env[68617]: DEBUG nova.compute.manager [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Starting instance... 
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1582.315356] env[68617]: DEBUG oslo_concurrency.lockutils [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1582.315640] env[68617]: DEBUG oslo_concurrency.lockutils [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1582.317126] env[68617]: INFO nova.compute.claims [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1582.513694] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9a70edd-52b8-4379-a66d-92b3b5b1a26c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1582.521652] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce14d96d-3944-4b96-ae53-483517056ade {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1582.551989] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03801e76-86bd-400f-8336-b562c8f2c27b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1582.559198] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e24672fb-c675-4a66-9906-9a4a97e36681 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1582.571780] env[68617]: DEBUG nova.compute.provider_tree [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1582.580129] env[68617]: DEBUG nova.scheduler.client.report [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1582.594102] env[68617]: DEBUG oslo_concurrency.lockutils [None 
req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.278s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1582.595046] env[68617]: DEBUG nova.compute.manager [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1582.624051] env[68617]: DEBUG nova.compute.utils [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1582.625712] env[68617]: DEBUG nova.compute.manager [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1582.625930] env[68617]: DEBUG nova.network.neutron [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1582.636207] env[68617]: DEBUG nova.compute.manager [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1582.690041] env[68617]: DEBUG nova.policy [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '11eecc8f059e410cb97bafaadc378f89', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4de7b27e9cf04c16b8dee80e756404fd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 1582.714686] env[68617]: DEBUG nova.compute.manager [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1582.742382] env[68617]: DEBUG nova.virt.hardware [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1582.742627] env[68617]: DEBUG nova.virt.hardware [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1582.742781] env[68617]: DEBUG nova.virt.hardware [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1582.742964] env[68617]: DEBUG nova.virt.hardware [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1582.743123] env[68617]: DEBUG nova.virt.hardware [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1582.743282] env[68617]: DEBUG nova.virt.hardware [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1582.743521] env[68617]: DEBUG nova.virt.hardware [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1582.743683] env[68617]: DEBUG nova.virt.hardware [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1582.743849] env[68617]: DEBUG nova.virt.hardware [None 
req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1582.744018] env[68617]: DEBUG nova.virt.hardware [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1582.744193] env[68617]: DEBUG nova.virt.hardware [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1582.745064] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16d501c0-8940-45a0-b108-086567240b69 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1582.752877] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f458ab42-a256-4ef1-8bb9-c05c21783dfa {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1582.990771] env[68617]: DEBUG nova.network.neutron [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Successfully created port: 86787fb1-799e-4c1e-a651-a28ace59fc4b {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1583.536630] env[68617]: DEBUG nova.network.neutron [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Successfully updated port: 86787fb1-799e-4c1e-a651-a28ace59fc4b {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1583.550817] env[68617]: DEBUG oslo_concurrency.lockutils [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "refresh_cache-f54002b0-d60e-44ff-82a5-ef2f5193c48c" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1583.550967] env[68617]: DEBUG oslo_concurrency.lockutils [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquired lock "refresh_cache-f54002b0-d60e-44ff-82a5-ef2f5193c48c" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1583.551138] env[68617]: DEBUG nova.network.neutron [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1583.586833] env[68617]: DEBUG nova.network.neutron [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 
tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1583.764949] env[68617]: DEBUG nova.network.neutron [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Updating instance_info_cache with network_info: [{"id": "86787fb1-799e-4c1e-a651-a28ace59fc4b", "address": "fa:16:3e:4f:90:74", "network": {"id": "ad29e76d-388b-42ca-9526-7b6c236321e3", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1855301645-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4de7b27e9cf04c16b8dee80e756404fd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8e272539-d425-489f-9a63-aba692e88933", "external-id": "nsx-vlan-transportzone-869", "segmentation_id": 869, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap86787fb1-79", "ovs_interfaceid": "86787fb1-799e-4c1e-a651-a28ace59fc4b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1583.779849] env[68617]: DEBUG oslo_concurrency.lockutils [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Releasing lock "refresh_cache-f54002b0-d60e-44ff-82a5-ef2f5193c48c" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1583.780142] env[68617]: DEBUG nova.compute.manager [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Instance network_info: |[{"id": "86787fb1-799e-4c1e-a651-a28ace59fc4b", "address": "fa:16:3e:4f:90:74", "network": {"id": "ad29e76d-388b-42ca-9526-7b6c236321e3", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1855301645-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4de7b27e9cf04c16b8dee80e756404fd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8e272539-d425-489f-9a63-aba692e88933", "external-id": "nsx-vlan-transportzone-869", "segmentation_id": 869, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap86787fb1-79", "ovs_interfaceid": "86787fb1-799e-4c1e-a651-a28ace59fc4b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1583.780540] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:4f:90:74', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8e272539-d425-489f-9a63-aba692e88933', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '86787fb1-799e-4c1e-a651-a28ace59fc4b', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1583.789431] env[68617]: DEBUG oslo.service.loopingcall [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1583.790008] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1583.790247] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0156ed46-e980-4e9b-840f-b56776d09799 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1583.811030] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1583.811030] env[68617]: value = "task-3470841" [ 1583.811030] env[68617]: _type = "Task" [ 1583.811030] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1583.819104] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470841, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1583.986268] env[68617]: DEBUG nova.compute.manager [req-3843ca9d-02c9-4f7a-a087-f93f7fc04b25 req-bddf3888-5eb7-425a-940a-1defd6cff3ee service nova] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Received event network-vif-plugged-86787fb1-799e-4c1e-a651-a28ace59fc4b {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1583.986268] env[68617]: DEBUG oslo_concurrency.lockutils [req-3843ca9d-02c9-4f7a-a087-f93f7fc04b25 req-bddf3888-5eb7-425a-940a-1defd6cff3ee service nova] Acquiring lock "f54002b0-d60e-44ff-82a5-ef2f5193c48c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1583.986268] env[68617]: DEBUG oslo_concurrency.lockutils [req-3843ca9d-02c9-4f7a-a087-f93f7fc04b25 req-bddf3888-5eb7-425a-940a-1defd6cff3ee service nova] Lock "f54002b0-d60e-44ff-82a5-ef2f5193c48c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1583.986515] env[68617]: DEBUG oslo_concurrency.lockutils [req-3843ca9d-02c9-4f7a-a087-f93f7fc04b25 req-bddf3888-5eb7-425a-940a-1defd6cff3ee service nova] Lock "f54002b0-d60e-44ff-82a5-ef2f5193c48c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1583.986762] env[68617]: DEBUG nova.compute.manager [req-3843ca9d-02c9-4f7a-a087-f93f7fc04b25 req-bddf3888-5eb7-425a-940a-1defd6cff3ee service nova] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] No waiting events found dispatching network-vif-plugged-86787fb1-799e-4c1e-a651-a28ace59fc4b {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1583.986979] env[68617]: WARNING nova.compute.manager [req-3843ca9d-02c9-4f7a-a087-f93f7fc04b25 req-bddf3888-5eb7-425a-940a-1defd6cff3ee service nova] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Received unexpected event network-vif-plugged-86787fb1-799e-4c1e-a651-a28ace59fc4b for instance with vm_state building and task_state spawning. [ 1583.987270] env[68617]: DEBUG nova.compute.manager [req-3843ca9d-02c9-4f7a-a087-f93f7fc04b25 req-bddf3888-5eb7-425a-940a-1defd6cff3ee service nova] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Received event network-changed-86787fb1-799e-4c1e-a651-a28ace59fc4b {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1583.987533] env[68617]: DEBUG nova.compute.manager [req-3843ca9d-02c9-4f7a-a087-f93f7fc04b25 req-bddf3888-5eb7-425a-940a-1defd6cff3ee service nova] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Refreshing instance network info cache due to event network-changed-86787fb1-799e-4c1e-a651-a28ace59fc4b. 
{{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1583.987805] env[68617]: DEBUG oslo_concurrency.lockutils [req-3843ca9d-02c9-4f7a-a087-f93f7fc04b25 req-bddf3888-5eb7-425a-940a-1defd6cff3ee service nova] Acquiring lock "refresh_cache-f54002b0-d60e-44ff-82a5-ef2f5193c48c" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1583.987998] env[68617]: DEBUG oslo_concurrency.lockutils [req-3843ca9d-02c9-4f7a-a087-f93f7fc04b25 req-bddf3888-5eb7-425a-940a-1defd6cff3ee service nova] Acquired lock "refresh_cache-f54002b0-d60e-44ff-82a5-ef2f5193c48c" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1583.988257] env[68617]: DEBUG nova.network.neutron [req-3843ca9d-02c9-4f7a-a087-f93f7fc04b25 req-bddf3888-5eb7-425a-940a-1defd6cff3ee service nova] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Refreshing network info cache for port 86787fb1-799e-4c1e-a651-a28ace59fc4b {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1584.231056] env[68617]: DEBUG nova.network.neutron [req-3843ca9d-02c9-4f7a-a087-f93f7fc04b25 req-bddf3888-5eb7-425a-940a-1defd6cff3ee service nova] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Updated VIF entry in instance network info cache for port 86787fb1-799e-4c1e-a651-a28ace59fc4b. {{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1584.231429] env[68617]: DEBUG nova.network.neutron [req-3843ca9d-02c9-4f7a-a087-f93f7fc04b25 req-bddf3888-5eb7-425a-940a-1defd6cff3ee service nova] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Updating instance_info_cache with network_info: [{"id": "86787fb1-799e-4c1e-a651-a28ace59fc4b", "address": "fa:16:3e:4f:90:74", "network": {"id": "ad29e76d-388b-42ca-9526-7b6c236321e3", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1855301645-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4de7b27e9cf04c16b8dee80e756404fd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8e272539-d425-489f-9a63-aba692e88933", "external-id": "nsx-vlan-transportzone-869", "segmentation_id": 869, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap86787fb1-79", "ovs_interfaceid": "86787fb1-799e-4c1e-a651-a28ace59fc4b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1584.242180] env[68617]: DEBUG oslo_concurrency.lockutils [req-3843ca9d-02c9-4f7a-a087-f93f7fc04b25 req-bddf3888-5eb7-425a-940a-1defd6cff3ee service nova] Releasing lock "refresh_cache-f54002b0-d60e-44ff-82a5-ef2f5193c48c" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1584.320967] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470841, 'name': CreateVM_Task, 'duration_secs': 0.301601} completed successfully. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1584.321114] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1584.321793] env[68617]: DEBUG oslo_concurrency.lockutils [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1584.321956] env[68617]: DEBUG oslo_concurrency.lockutils [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1584.322282] env[68617]: DEBUG oslo_concurrency.lockutils [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1584.322524] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3d14d4a4-8550-4e69-b7c5-0555e0e5e70a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1584.326801] env[68617]: DEBUG oslo_vmware.api [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Waiting for the task: (returnval){ [ 1584.326801] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52d5b5d9-19e0-ac39-8dfb-6ef131015614" [ 1584.326801] env[68617]: _type = "Task" [ 1584.326801] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1584.335843] env[68617]: DEBUG oslo_vmware.api [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52d5b5d9-19e0-ac39-8dfb-6ef131015614, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1584.836810] env[68617]: DEBUG oslo_concurrency.lockutils [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1584.837063] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1584.837283] env[68617]: DEBUG oslo_concurrency.lockutils [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1589.608636] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Acquiring lock "922c8926-c636-4463-85d6-4f2a6325b85a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1589.608950] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Lock "922c8926-c636-4463-85d6-4f2a6325b85a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1610.411499] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1610.411848] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1610.411848] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1610.433247] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Skipping network cache update for instance because it is Building. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1610.433407] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1610.433538] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1610.433663] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1610.433786] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1610.433907] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1610.434034] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1610.434157] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1610.434274] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1610.434391] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1610.434512] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1612.699063] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1612.699063] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1614.695240] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1614.723056] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1614.723420] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1614.723531] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1614.723587] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1617.699777] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1617.712160] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1617.712936] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1617.712936] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1617.712936] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1617.713851] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b64fbc0-dc3b-43f9-946c-ecf561abb93c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1617.722848] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88857112-9d9d-472c-b306-d992fba5101a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1617.737789] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-885c43aa-7017-4d47-8d9a-906bfaa557fd {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1617.743905] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85f5b422-8d57-4c8b-b3b9-76ba6c9270fa {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1617.772981] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180923MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1617.773149] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1617.773342] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1617.842757] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance d46ca6f3-0ee9-412c-98b4-f639ce4f9228 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1617.842928] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a8ff6232-530c-453a-96e4-f8ce00f976e3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1617.843080] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1617.843267] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e90877a8-47d3-47d7-8362-5bcfe3a98c36 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1617.843324] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f03b9bc5-9438-4c0c-b595-72c631bece08 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1617.843439] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance ee6efd93-25be-4268-afe9-ba39e543a4fb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1617.843555] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 1605028f-5d6d-4ac4-8416-c0465982c53a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1617.843671] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance fc1043b8-535d-4af0-b92b-1f43580cdc9a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1617.843787] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1617.843899] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f54002b0-d60e-44ff-82a5-ef2f5193c48c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1617.854253] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 2c950cba-7698-48e0-8852-bf569f58f967 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1617.865087] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 12ed2a40-3d74-49a2-95b4-ccaaf58c8060 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1617.875981] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 21d0560a-fde3-4c16-b2fc-06d6f8668a7a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1617.886431] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 902b5ab9-23b8-450f-853a-b2da889c3afd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1617.896215] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 922c8926-c636-4463-85d6-4f2a6325b85a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1617.896471] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1617.896623] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1618.075397] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d332d055-51e2-4625-a613-3b5571582cd3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1618.082972] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79e837d8-4107-4ed7-bc15-2af1dc3eb686 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1618.112423] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b8c19da-df16-45e3-ac6c-041d2d4d234d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1618.119030] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1aaa732d-52a3-4cb2-8044-0f45029ee25d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1618.131363] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1618.139632] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1618.152108] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1618.152279] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.379s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1619.146769] env[68617]: DEBUG oslo_service.periodic_task [None 
req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1619.699676] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1629.942882] env[68617]: WARNING oslo_vmware.rw_handles [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1629.942882] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1629.942882] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1629.942882] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1629.942882] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1629.942882] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 1629.942882] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1629.942882] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1629.942882] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1629.942882] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1629.942882] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1629.942882] env[68617]: ERROR oslo_vmware.rw_handles [ 1629.943580] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/a4bb14cf-506e-4695-a81a-e038c4195f15/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1629.945308] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1629.945576] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Copying Virtual Disk [datastore2] vmware_temp/a4bb14cf-506e-4695-a81a-e038c4195f15/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/a4bb14cf-506e-4695-a81a-e038c4195f15/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 
1629.945883] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-68003325-4f34-42f7-bbc6-b7753c122ff5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1629.954109] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Waiting for the task: (returnval){ [ 1629.954109] env[68617]: value = "task-3470842" [ 1629.954109] env[68617]: _type = "Task" [ 1629.954109] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1629.961778] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Task: {'id': task-3470842, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1630.464705] env[68617]: DEBUG oslo_vmware.exceptions [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Fault InvalidArgument not matched. {{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1630.464944] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1630.465560] env[68617]: ERROR nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1630.465560] env[68617]: Faults: ['InvalidArgument'] [ 1630.465560] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Traceback (most recent call last): [ 1630.465560] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1630.465560] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] yield resources [ 1630.465560] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1630.465560] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] self.driver.spawn(context, instance, image_meta, [ 1630.465560] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1630.465560] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1630.465560] env[68617]: ERROR nova.compute.manager 
[instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1630.465560] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] self._fetch_image_if_missing(context, vi) [ 1630.465560] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1630.466162] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] image_cache(vi, tmp_image_ds_loc) [ 1630.466162] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1630.466162] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] vm_util.copy_virtual_disk( [ 1630.466162] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1630.466162] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] session._wait_for_task(vmdk_copy_task) [ 1630.466162] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1630.466162] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] return self.wait_for_task(task_ref) [ 1630.466162] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1630.466162] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] return evt.wait() [ 1630.466162] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1630.466162] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] result = hub.switch() [ 1630.466162] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1630.466162] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] return self.greenlet.switch() [ 1630.466576] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1630.466576] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] self.f(*self.args, **self.kw) [ 1630.466576] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1630.466576] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] raise exceptions.translate_fault(task_info.error) [ 1630.466576] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1630.466576] env[68617]: ERROR nova.compute.manager [instance: 
d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Faults: ['InvalidArgument'] [ 1630.466576] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] [ 1630.466576] env[68617]: INFO nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Terminating instance [ 1630.467526] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1630.467745] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1630.467972] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-edce1b4e-73aa-42b9-88a5-fae530dcfa43 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1630.470294] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Start destroying the instance on the hypervisor. 
{{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1630.470493] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1630.471194] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd601834-929f-4ae2-8bcb-5c7acb169f59 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1630.477621] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1630.477879] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3512e633-4a95-4e3a-9ac5-677274b25042 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1630.479950] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1630.480137] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1630.481063] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7352ddad-d126-473c-8df2-343dc9c68fab {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1630.485787] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Waiting for the task: (returnval){ [ 1630.485787] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]525fbfb5-1f81-4276-3214-4eb6b3f20be6" [ 1630.485787] env[68617]: _type = "Task" [ 1630.485787] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1630.496093] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]525fbfb5-1f81-4276-3214-4eb6b3f20be6, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1630.547158] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1630.547381] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1630.547557] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Deleting the datastore file [datastore2] d46ca6f3-0ee9-412c-98b4-f639ce4f9228 {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1630.547959] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1545d606-55aa-45da-83e6-dca4256d67c0 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1630.553502] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Waiting for the task: (returnval){ [ 1630.553502] env[68617]: value = "task-3470844" [ 1630.553502] env[68617]: _type = "Task" [ 1630.553502] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1630.561096] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Task: {'id': task-3470844, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1630.996663] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1630.996920] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Creating directory with path [datastore2] vmware_temp/c5fc180f-2cfd-40ca-85ae-b71d88b1f484/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1630.997173] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-81491384-410e-4efb-8aac-de71776feb7a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1631.008545] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Created directory with path [datastore2] vmware_temp/c5fc180f-2cfd-40ca-85ae-b71d88b1f484/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1631.008736] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Fetch image to [datastore2] vmware_temp/c5fc180f-2cfd-40ca-85ae-b71d88b1f484/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1631.008954] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/c5fc180f-2cfd-40ca-85ae-b71d88b1f484/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1631.009685] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4207f387-71d8-47b7-8324-7db3407e17a9 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1631.017864] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f70ef69-7048-4d9d-b010-15ea3108183d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1631.026510] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f936a884-2df8-4bbd-88d9-6d7d9cff5911 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1631.058735] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-91e039b9-0587-4e5c-bf03-8a6b13ac8197 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1631.065224] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Task: {'id': task-3470844, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076183} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1631.066622] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1631.066810] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1631.066978] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1631.067168] env[68617]: INFO nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1631.068879] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f565b316-b8de-45a1-a8ac-15040772e983 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1631.070664] env[68617]: DEBUG nova.compute.claims [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1631.070833] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1631.071053] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1631.092525] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1631.142881] env[68617]: DEBUG oslo_vmware.rw_handles [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c5fc180f-2cfd-40ca-85ae-b71d88b1f484/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1631.204979] env[68617]: DEBUG oslo_vmware.rw_handles [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1631.205184] env[68617]: DEBUG oslo_vmware.rw_handles [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c5fc180f-2cfd-40ca-85ae-b71d88b1f484/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1631.341105] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0cb07ce-0bf1-46a7-9264-da7ae9c632f2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1631.349083] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8f6eecd-6568-42d6-ad8e-7c828d977fa1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1631.378892] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c3faeea-c39e-4be1-b481-4ac67768ffd3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1631.385760] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afe2280d-3e5a-4552-b1fc-45eee34f0ca5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1631.398344] env[68617]: DEBUG nova.compute.provider_tree [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1631.407272] env[68617]: DEBUG nova.scheduler.client.report [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1631.420702] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.350s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1631.421231] env[68617]: ERROR nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1631.421231] env[68617]: Faults: ['InvalidArgument'] [ 1631.421231] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Traceback (most recent call last): [ 1631.421231] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance 
[ 1631.421231] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] self.driver.spawn(context, instance, image_meta, [ 1631.421231] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1631.421231] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1631.421231] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1631.421231] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] self._fetch_image_if_missing(context, vi) [ 1631.421231] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1631.421231] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] image_cache(vi, tmp_image_ds_loc) [ 1631.421231] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1631.421546] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] vm_util.copy_virtual_disk( [ 1631.421546] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1631.421546] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] session._wait_for_task(vmdk_copy_task) [ 1631.421546] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1631.421546] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] return self.wait_for_task(task_ref) [ 1631.421546] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1631.421546] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] return evt.wait() [ 1631.421546] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1631.421546] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] result = hub.switch() [ 1631.421546] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1631.421546] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] return self.greenlet.switch() [ 1631.421546] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1631.421546] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] self.f(*self.args, **self.kw) [ 1631.421852] env[68617]: ERROR nova.compute.manager [instance: 
d46ca6f3-0ee9-412c-98b4-f639ce4f9228] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1631.421852] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] raise exceptions.translate_fault(task_info.error) [ 1631.421852] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1631.421852] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Faults: ['InvalidArgument'] [ 1631.421852] env[68617]: ERROR nova.compute.manager [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] [ 1631.421972] env[68617]: DEBUG nova.compute.utils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1631.423223] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Build of instance d46ca6f3-0ee9-412c-98b4-f639ce4f9228 was re-scheduled: A specified parameter was not correct: fileType [ 1631.423223] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1631.423627] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1631.423802] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. 
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1631.423972] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1631.424152] env[68617]: DEBUG nova.network.neutron [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1631.725630] env[68617]: DEBUG nova.network.neutron [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1631.737053] env[68617]: INFO nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Took 0.31 seconds to deallocate network for instance. [ 1631.840676] env[68617]: INFO nova.scheduler.client.report [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Deleted allocations for instance d46ca6f3-0ee9-412c-98b4-f639ce4f9228 [ 1631.869441] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "d46ca6f3-0ee9-412c-98b4-f639ce4f9228" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 534.384s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1631.870689] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "d46ca6f3-0ee9-412c-98b4-f639ce4f9228" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 342.329s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1631.870906] env[68617]: INFO nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] During sync_power_state the instance has a pending task (spawning). Skip. 
[ 1631.871151] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "d46ca6f3-0ee9-412c-98b4-f639ce4f9228" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1631.871718] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9d26d2ca-c76b-4b35-8148-3e3abf9c0836 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "d46ca6f3-0ee9-412c-98b4-f639ce4f9228" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 337.846s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1631.871990] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9d26d2ca-c76b-4b35-8148-3e3abf9c0836 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquiring lock "d46ca6f3-0ee9-412c-98b4-f639ce4f9228-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1631.872737] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9d26d2ca-c76b-4b35-8148-3e3abf9c0836 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "d46ca6f3-0ee9-412c-98b4-f639ce4f9228-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1631.872950] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9d26d2ca-c76b-4b35-8148-3e3abf9c0836 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "d46ca6f3-0ee9-412c-98b4-f639ce4f9228-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1631.874922] env[68617]: INFO nova.compute.manager [None req-9d26d2ca-c76b-4b35-8148-3e3abf9c0836 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Terminating instance [ 1631.877601] env[68617]: DEBUG nova.compute.manager [None req-9d26d2ca-c76b-4b35-8148-3e3abf9c0836 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Start destroying the instance on the hypervisor. 
{{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1631.877836] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9d26d2ca-c76b-4b35-8148-3e3abf9c0836 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1631.878048] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1d9cb8e5-2bd1-440f-ba14-4c5c16b3f308 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1631.888637] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-850fecb1-84eb-4f93-a96f-53b7bb6dd50a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1631.901544] env[68617]: DEBUG nova.compute.manager [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1631.922261] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-9d26d2ca-c76b-4b35-8148-3e3abf9c0836 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance d46ca6f3-0ee9-412c-98b4-f639ce4f9228 could not be found. [ 1631.922261] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-9d26d2ca-c76b-4b35-8148-3e3abf9c0836 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1631.922439] env[68617]: INFO nova.compute.manager [None req-9d26d2ca-c76b-4b35-8148-3e3abf9c0836 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1631.923513] env[68617]: DEBUG oslo.service.loopingcall [None req-9d26d2ca-c76b-4b35-8148-3e3abf9c0836 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1631.923513] env[68617]: DEBUG nova.compute.manager [-] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1631.923513] env[68617]: DEBUG nova.network.neutron [-] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1631.947375] env[68617]: DEBUG nova.network.neutron [-] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1631.954198] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1631.954434] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1631.955934] env[68617]: INFO nova.compute.claims [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1631.960279] env[68617]: INFO nova.compute.manager [-] [instance: d46ca6f3-0ee9-412c-98b4-f639ce4f9228] Took 0.04 seconds to deallocate network for instance. 
[ 1632.047118] env[68617]: DEBUG oslo_concurrency.lockutils [None req-9d26d2ca-c76b-4b35-8148-3e3abf9c0836 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "d46ca6f3-0ee9-412c-98b4-f639ce4f9228" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.175s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1632.164563] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e27ab3e3-96aa-4015-b95e-4c2da901dab5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1632.172403] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7599c1d-b9df-4357-b8b6-219a962e53d4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1632.202856] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fff15e3-db2c-4b03-bb19-b537424e8fd1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1632.209415] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-934b4445-9e18-4ba7-bf41-ddf25c2b2718 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1632.222116] env[68617]: DEBUG nova.compute.provider_tree [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1632.230649] env[68617]: DEBUG nova.scheduler.client.report [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1632.244676] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.290s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1632.245054] env[68617]: DEBUG nova.compute.manager [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Start building networks asynchronously for instance. 
{{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1632.274939] env[68617]: DEBUG nova.compute.utils [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1632.276090] env[68617]: DEBUG nova.compute.manager [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1632.276260] env[68617]: DEBUG nova.network.neutron [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1632.285994] env[68617]: DEBUG nova.compute.manager [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1632.335868] env[68617]: DEBUG nova.policy [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1e8d6642b99a4079b559b5a6fbaf2a1d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '08802c2c6e5c4f509de416e847dd8cfd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 1632.349182] env[68617]: DEBUG nova.compute.manager [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1632.374870] env[68617]: DEBUG nova.virt.hardware [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1632.374870] env[68617]: DEBUG nova.virt.hardware [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1632.375563] env[68617]: DEBUG nova.virt.hardware [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1632.375728] env[68617]: DEBUG nova.virt.hardware [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1632.375885] env[68617]: DEBUG nova.virt.hardware [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1632.376047] env[68617]: DEBUG nova.virt.hardware [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1632.376261] env[68617]: DEBUG nova.virt.hardware [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1632.376426] env[68617]: DEBUG nova.virt.hardware [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1632.376588] env[68617]: DEBUG nova.virt.hardware [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 
tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1632.376746] env[68617]: DEBUG nova.virt.hardware [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1632.376910] env[68617]: DEBUG nova.virt.hardware [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1632.377764] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-457079f7-46a8-49c8-ae2a-cacca66ae100 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1632.385849] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b8360ec-a11e-4195-a669-8da951ffb2dc {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1632.620975] env[68617]: DEBUG nova.network.neutron [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Successfully created port: 205dfe1d-0c1e-4063-8d5b-7bf2ae678cb4 {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1633.205359] env[68617]: DEBUG nova.network.neutron [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Successfully updated port: 205dfe1d-0c1e-4063-8d5b-7bf2ae678cb4 {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1633.218166] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Acquiring lock "refresh_cache-2c950cba-7698-48e0-8852-bf569f58f967" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1633.218336] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Acquired lock "refresh_cache-2c950cba-7698-48e0-8852-bf569f58f967" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1633.218824] env[68617]: DEBUG nova.network.neutron [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1633.270240] env[68617]: DEBUG nova.network.neutron [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 
2c950cba-7698-48e0-8852-bf569f58f967] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1633.414528] env[68617]: DEBUG nova.network.neutron [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Updating instance_info_cache with network_info: [{"id": "205dfe1d-0c1e-4063-8d5b-7bf2ae678cb4", "address": "fa:16:3e:73:88:af", "network": {"id": "bfe9f92d-2dc0-441f-811f-ee0196fd4988", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1329228587-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "08802c2c6e5c4f509de416e847dd8cfd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c883fb98-d172-4510-8cf4-07aafdf771af", "external-id": "nsx-vlan-transportzone-570", "segmentation_id": 570, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap205dfe1d-0c", "ovs_interfaceid": "205dfe1d-0c1e-4063-8d5b-7bf2ae678cb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1633.425142] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Releasing lock "refresh_cache-2c950cba-7698-48e0-8852-bf569f58f967" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1633.425973] env[68617]: DEBUG nova.compute.manager [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Instance network_info: |[{"id": "205dfe1d-0c1e-4063-8d5b-7bf2ae678cb4", "address": "fa:16:3e:73:88:af", "network": {"id": "bfe9f92d-2dc0-441f-811f-ee0196fd4988", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1329228587-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "08802c2c6e5c4f509de416e847dd8cfd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c883fb98-d172-4510-8cf4-07aafdf771af", "external-id": "nsx-vlan-transportzone-570", "segmentation_id": 570, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap205dfe1d-0c", "ovs_interfaceid": "205dfe1d-0c1e-4063-8d5b-7bf2ae678cb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1633.426624] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:73:88:af', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c883fb98-d172-4510-8cf4-07aafdf771af', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '205dfe1d-0c1e-4063-8d5b-7bf2ae678cb4', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1633.434441] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Creating folder: Project (08802c2c6e5c4f509de416e847dd8cfd). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1633.434921] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8c754115-a6f2-4fb6-8708-8b4ce24b6967 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1633.446888] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Created folder: Project (08802c2c6e5c4f509de416e847dd8cfd) in parent group-v693691. [ 1633.447086] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Creating folder: Instances. Parent ref: group-v693778. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1633.447313] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-59a29870-cc9b-42cc-bbd1-cb06924477a5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1633.456302] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Created folder: Instances in parent group-v693778. [ 1633.456523] env[68617]: DEBUG oslo.service.loopingcall [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1633.456700] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1633.456905] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f118118c-73f3-40bf-b330-c9038135d95f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1633.476336] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1633.476336] env[68617]: value = "task-3470847" [ 1633.476336] env[68617]: _type = "Task" [ 1633.476336] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1633.483685] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470847, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1633.918104] env[68617]: DEBUG nova.compute.manager [req-d9fe1058-3ca5-46b2-a7a7-803cda7c1894 req-04af0583-4fb7-4c9d-b071-e63f7c771964 service nova] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Received event network-vif-plugged-205dfe1d-0c1e-4063-8d5b-7bf2ae678cb4 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1633.918341] env[68617]: DEBUG oslo_concurrency.lockutils [req-d9fe1058-3ca5-46b2-a7a7-803cda7c1894 req-04af0583-4fb7-4c9d-b071-e63f7c771964 service nova] Acquiring lock "2c950cba-7698-48e0-8852-bf569f58f967-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1633.918553] env[68617]: DEBUG oslo_concurrency.lockutils [req-d9fe1058-3ca5-46b2-a7a7-803cda7c1894 req-04af0583-4fb7-4c9d-b071-e63f7c771964 service nova] Lock "2c950cba-7698-48e0-8852-bf569f58f967-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1633.918721] env[68617]: DEBUG oslo_concurrency.lockutils [req-d9fe1058-3ca5-46b2-a7a7-803cda7c1894 req-04af0583-4fb7-4c9d-b071-e63f7c771964 service nova] Lock "2c950cba-7698-48e0-8852-bf569f58f967-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1633.918885] env[68617]: DEBUG nova.compute.manager [req-d9fe1058-3ca5-46b2-a7a7-803cda7c1894 req-04af0583-4fb7-4c9d-b071-e63f7c771964 service nova] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] No waiting events found dispatching network-vif-plugged-205dfe1d-0c1e-4063-8d5b-7bf2ae678cb4 {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1633.919292] env[68617]: WARNING nova.compute.manager [req-d9fe1058-3ca5-46b2-a7a7-803cda7c1894 req-04af0583-4fb7-4c9d-b071-e63f7c771964 service nova] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Received unexpected event network-vif-plugged-205dfe1d-0c1e-4063-8d5b-7bf2ae678cb4 for instance with vm_state building and task_state spawning. [ 1633.919458] env[68617]: DEBUG nova.compute.manager [req-d9fe1058-3ca5-46b2-a7a7-803cda7c1894 req-04af0583-4fb7-4c9d-b071-e63f7c771964 service nova] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Received event network-changed-205dfe1d-0c1e-4063-8d5b-7bf2ae678cb4 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1633.919582] env[68617]: DEBUG nova.compute.manager [req-d9fe1058-3ca5-46b2-a7a7-803cda7c1894 req-04af0583-4fb7-4c9d-b071-e63f7c771964 service nova] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Refreshing instance network info cache due to event network-changed-205dfe1d-0c1e-4063-8d5b-7bf2ae678cb4. 
{{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1633.919769] env[68617]: DEBUG oslo_concurrency.lockutils [req-d9fe1058-3ca5-46b2-a7a7-803cda7c1894 req-04af0583-4fb7-4c9d-b071-e63f7c771964 service nova] Acquiring lock "refresh_cache-2c950cba-7698-48e0-8852-bf569f58f967" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1633.919904] env[68617]: DEBUG oslo_concurrency.lockutils [req-d9fe1058-3ca5-46b2-a7a7-803cda7c1894 req-04af0583-4fb7-4c9d-b071-e63f7c771964 service nova] Acquired lock "refresh_cache-2c950cba-7698-48e0-8852-bf569f58f967" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1633.920076] env[68617]: DEBUG nova.network.neutron [req-d9fe1058-3ca5-46b2-a7a7-803cda7c1894 req-04af0583-4fb7-4c9d-b071-e63f7c771964 service nova] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Refreshing network info cache for port 205dfe1d-0c1e-4063-8d5b-7bf2ae678cb4 {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1633.986108] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470847, 'name': CreateVM_Task, 'duration_secs': 0.286872} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1633.986285] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1633.988090] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1633.988090] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1633.988090] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1633.988090] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6e59f00c-e95c-46ce-a791-b22569c5e279 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1633.992315] env[68617]: DEBUG oslo_vmware.api [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Waiting for the task: (returnval){ [ 1633.992315] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]520aafce-d92c-48a2-b6a0-94e0af568781" [ 1633.992315] env[68617]: _type = "Task" [ 1633.992315] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1634.002230] env[68617]: DEBUG oslo_vmware.api [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]520aafce-d92c-48a2-b6a0-94e0af568781, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1634.309351] env[68617]: DEBUG nova.network.neutron [req-d9fe1058-3ca5-46b2-a7a7-803cda7c1894 req-04af0583-4fb7-4c9d-b071-e63f7c771964 service nova] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Updated VIF entry in instance network info cache for port 205dfe1d-0c1e-4063-8d5b-7bf2ae678cb4. {{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1634.309705] env[68617]: DEBUG nova.network.neutron [req-d9fe1058-3ca5-46b2-a7a7-803cda7c1894 req-04af0583-4fb7-4c9d-b071-e63f7c771964 service nova] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Updating instance_info_cache with network_info: [{"id": "205dfe1d-0c1e-4063-8d5b-7bf2ae678cb4", "address": "fa:16:3e:73:88:af", "network": {"id": "bfe9f92d-2dc0-441f-811f-ee0196fd4988", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1329228587-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "08802c2c6e5c4f509de416e847dd8cfd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c883fb98-d172-4510-8cf4-07aafdf771af", "external-id": "nsx-vlan-transportzone-570", "segmentation_id": 570, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap205dfe1d-0c", "ovs_interfaceid": "205dfe1d-0c1e-4063-8d5b-7bf2ae678cb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1634.320635] env[68617]: DEBUG oslo_concurrency.lockutils [req-d9fe1058-3ca5-46b2-a7a7-803cda7c1894 req-04af0583-4fb7-4c9d-b071-e63f7c771964 service nova] Releasing lock "refresh_cache-2c950cba-7698-48e0-8852-bf569f58f967" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1634.504524] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1634.504829] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1634.505056] env[68617]: 
DEBUG oslo_concurrency.lockutils [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1669.698769] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1669.699161] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1669.699161] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1669.722438] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1669.722587] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1669.722710] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1669.722833] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1669.722958] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1669.723092] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1669.723213] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Skipping network cache update for instance because it is Building. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1669.723331] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1669.723446] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1669.723559] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1669.723678] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1673.700022] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1673.700415] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1674.699493] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1675.699599] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1675.699910] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1675.700010] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1677.458853] env[68617]: WARNING oslo_vmware.rw_handles [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1677.458853] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1677.458853] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1677.458853] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1677.458853] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1677.458853] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 1677.458853] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1677.458853] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1677.458853] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1677.458853] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1677.458853] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1677.458853] env[68617]: ERROR oslo_vmware.rw_handles [ 1677.458853] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/c5fc180f-2cfd-40ca-85ae-b71d88b1f484/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1677.460865] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1677.461124] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Copying Virtual Disk [datastore2] vmware_temp/c5fc180f-2cfd-40ca-85ae-b71d88b1f484/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/c5fc180f-2cfd-40ca-85ae-b71d88b1f484/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1677.461400] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ae95f279-2694-47c5-880c-fec0b30dcfe7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1677.469841] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 
tempest-ListServersNegativeTestJSON-361708762-project-member] Waiting for the task: (returnval){ [ 1677.469841] env[68617]: value = "task-3470848" [ 1677.469841] env[68617]: _type = "Task" [ 1677.469841] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1677.477902] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Task: {'id': task-3470848, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1677.980967] env[68617]: DEBUG oslo_vmware.exceptions [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Fault InvalidArgument not matched. {{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1677.981261] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1677.981818] env[68617]: ERROR nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1677.981818] env[68617]: Faults: ['InvalidArgument'] [ 1677.981818] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Traceback (most recent call last): [ 1677.981818] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1677.981818] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] yield resources [ 1677.981818] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1677.981818] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] self.driver.spawn(context, instance, image_meta, [ 1677.981818] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1677.981818] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1677.981818] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1677.981818] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] self._fetch_image_if_missing(context, vi) [ 1677.981818] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", 
line 639, in _fetch_image_if_missing [ 1677.982156] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] image_cache(vi, tmp_image_ds_loc) [ 1677.982156] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1677.982156] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] vm_util.copy_virtual_disk( [ 1677.982156] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1677.982156] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] session._wait_for_task(vmdk_copy_task) [ 1677.982156] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1677.982156] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] return self.wait_for_task(task_ref) [ 1677.982156] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1677.982156] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] return evt.wait() [ 1677.982156] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1677.982156] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] result = hub.switch() [ 1677.982156] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1677.982156] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] return self.greenlet.switch() [ 1677.982483] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1677.982483] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] self.f(*self.args, **self.kw) [ 1677.982483] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1677.982483] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] raise exceptions.translate_fault(task_info.error) [ 1677.982483] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1677.982483] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Faults: ['InvalidArgument'] [ 1677.982483] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] [ 1677.982483] env[68617]: INFO nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Terminating instance [ 
1677.983674] env[68617]: DEBUG oslo_concurrency.lockutils [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1677.983890] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1677.984152] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-aad5126a-9269-4b3f-9b5f-af3c1f7791bd {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1677.986464] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1677.986602] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1677.987465] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1d59d2b-742b-4411-ba5b-0782a90e3716 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1677.994593] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1677.994808] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2335199c-16b0-4065-9996-1d3115d641be {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1677.997095] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1677.997285] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1677.998256] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0db0cd11-7f3e-455d-b4c7-3a0d4a389833 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1678.002947] env[68617]: DEBUG oslo_vmware.api [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Waiting for the task: (returnval){ [ 1678.002947] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52cb478e-7333-f2ca-d7ea-01323b599729" [ 1678.002947] env[68617]: _type = "Task" [ 1678.002947] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1678.011967] env[68617]: DEBUG oslo_vmware.api [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52cb478e-7333-f2ca-d7ea-01323b599729, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1678.072250] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1678.072603] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1678.072885] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Deleting the datastore file [datastore2] a8ff6232-530c-453a-96e4-f8ce00f976e3 {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1678.073241] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-cfd966ab-39a3-4138-ab55-66576763a380 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1678.079793] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Waiting for the task: (returnval){ [ 1678.079793] env[68617]: value = "task-3470850" [ 1678.079793] env[68617]: _type = "Task" [ 1678.079793] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1678.087902] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Task: {'id': task-3470850, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1678.513190] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1678.513472] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Creating directory with path [datastore2] vmware_temp/68582ced-a982-4e0b-bd3f-4467139f3f24/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1678.513713] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-29640256-b51f-471d-83b6-53b97ab4c7a8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1678.525221] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Created directory with path [datastore2] vmware_temp/68582ced-a982-4e0b-bd3f-4467139f3f24/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1678.525411] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Fetch image to [datastore2] vmware_temp/68582ced-a982-4e0b-bd3f-4467139f3f24/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1678.525579] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/68582ced-a982-4e0b-bd3f-4467139f3f24/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1678.526364] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d82e418d-2045-41b6-8cbf-f646b5f0e6c4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1678.532709] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62edb4fa-ff72-4da8-b357-06c0a680f62e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1678.541627] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffff5be7-c5c8-48fc-9b69-4522728a4783 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1678.574276] env[68617]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76173321-21ad-4f70-94ce-96a0246cce68 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1678.582983] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ef705b52-a0ae-4596-9486-effcc0bf8c49 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1678.589263] env[68617]: DEBUG oslo_vmware.api [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Task: {'id': task-3470850, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067068} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1678.589498] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1678.589675] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1678.589844] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1678.590023] env[68617]: INFO nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Took 0.60 seconds to destroy the instance on the hypervisor. 
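The DeleteDatastoreFile_Task records above follow the task pattern that runs through this whole log: oslo.vmware submits a vCenter task, then wait_for_task blocks while a looping poll reads the task state, returning a result with a duration_secs on success (0.067068s for task-3470850 here) or raising a translated fault on error, which is exactly how the earlier CopyVirtualDisk_Task surfaced as VimFaultException with Faults: ['InvalidArgument']. A minimal self-contained sketch of that loop; get_task_info, its fields, and this VimFault class are illustrative stand-ins, not the real oslo.vmware API:

    import time

    class VimFault(Exception):
        # Stand-in for oslo_vmware.exceptions.VimFaultException: keeps the
        # fault names (e.g. ['InvalidArgument']) alongside the message.
        def __init__(self, message, faults):
            super().__init__(message)
            self.faults = faults

    def wait_for_task(get_task_info, interval=0.5):
        # Poll a vCenter-style task until it leaves the 'running' state.
        # get_task_info is assumed to return an object with a .state of
        # 'running', 'success' or 'error', plus .result/.message/.faults.
        start = time.monotonic()
        while True:
            info = get_task_info()
            if info.state == 'success':
                # Success path: hand back the result and a duration_secs
                # like the one printed for task-3470850 above.
                return info.result, time.monotonic() - start
            if info.state == 'error':
                # Error path: the task fault becomes a Python exception in
                # the caller, as in the spawn traceback earlier in the log.
                raise VimFault(info.message, info.faults)
            time.sleep(interval)  # still running; poll again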
[ 1678.592187] env[68617]: DEBUG nova.compute.claims [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1678.592357] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1678.592570] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1678.603843] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1678.657522] env[68617]: DEBUG oslo_vmware.rw_handles [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/68582ced-a982-4e0b-bd3f-4467139f3f24/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1678.717125] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1678.723575] env[68617]: DEBUG oslo_vmware.rw_handles [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1678.723746] env[68617]: DEBUG oslo_vmware.rw_handles [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/68582ced-a982-4e0b-bd3f-4467139f3f24/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1678.731119] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1678.866701] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b20328ab-8b5b-44e6-b14d-3d7cf6b29ca3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1678.874530] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5dec7b70-c388-4f85-811e-59359b73f98b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1678.904339] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8267552-535a-4914-b1a1-4442593d7c8b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1678.912055] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0757c47e-6aee-4c64-bc62-d422528334b6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1678.926487] env[68617]: DEBUG nova.compute.provider_tree [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1678.935735] env[68617]: DEBUG nova.scheduler.client.report [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1678.948244] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.356s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1678.948765] env[68617]: ERROR nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1678.948765] env[68617]: Faults: 
['InvalidArgument'] [ 1678.948765] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Traceback (most recent call last): [ 1678.948765] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1678.948765] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] self.driver.spawn(context, instance, image_meta, [ 1678.948765] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1678.948765] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1678.948765] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1678.948765] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] self._fetch_image_if_missing(context, vi) [ 1678.948765] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1678.948765] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] image_cache(vi, tmp_image_ds_loc) [ 1678.948765] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1678.949116] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] vm_util.copy_virtual_disk( [ 1678.949116] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1678.949116] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] session._wait_for_task(vmdk_copy_task) [ 1678.949116] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1678.949116] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] return self.wait_for_task(task_ref) [ 1678.949116] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1678.949116] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] return evt.wait() [ 1678.949116] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1678.949116] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] result = hub.switch() [ 1678.949116] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1678.949116] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] return self.greenlet.switch() [ 1678.949116] env[68617]: ERROR nova.compute.manager [instance: 
a8ff6232-530c-453a-96e4-f8ce00f976e3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1678.949116] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] self.f(*self.args, **self.kw) [ 1678.949415] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1678.949415] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] raise exceptions.translate_fault(task_info.error) [ 1678.949415] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1678.949415] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Faults: ['InvalidArgument'] [ 1678.949415] env[68617]: ERROR nova.compute.manager [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] [ 1678.949540] env[68617]: DEBUG nova.compute.utils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1678.950434] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.219s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1678.950607] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1678.950755] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1678.951402] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Build of instance a8ff6232-530c-453a-96e4-f8ce00f976e3 was re-scheduled: A specified parameter was not correct: fileType [ 1678.951402] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1678.951773] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1678.951988] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 
tempest-ListServersNegativeTestJSON-361708762-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1678.952158] env[68617]: DEBUG nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1678.952270] env[68617]: DEBUG nova.network.neutron [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1678.954358] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0770b995-3102-48e8-bf73-7071d69102f8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1678.963267] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-811d920c-dd82-42ed-874b-2fdd6813e4c5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1678.977978] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a208aa26-af84-4f10-97a8-205c2db9b5fe {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1678.984663] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28390d16-6235-4af7-bc28-abfb8ee6e52f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1679.013470] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180947MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1679.013606] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1679.013813] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1679.101355] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a8ff6232-530c-453a-96e4-f8ce00f976e3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1679.101528] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1679.101624] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance e90877a8-47d3-47d7-8362-5bcfe3a98c36 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1679.101755] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f03b9bc5-9438-4c0c-b595-72c631bece08 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1679.101864] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance ee6efd93-25be-4268-afe9-ba39e543a4fb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1679.101980] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 1605028f-5d6d-4ac4-8416-c0465982c53a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1679.102110] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance fc1043b8-535d-4af0-b92b-1f43580cdc9a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1679.102225] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1679.102338] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f54002b0-d60e-44ff-82a5-ef2f5193c48c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1679.102448] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 2c950cba-7698-48e0-8852-bf569f58f967 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1679.113563] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 12ed2a40-3d74-49a2-95b4-ccaaf58c8060 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1679.125930] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 21d0560a-fde3-4c16-b2fc-06d6f8668a7a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1679.137406] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 902b5ab9-23b8-450f-853a-b2da889c3afd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1679.149108] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 922c8926-c636-4463-85d6-4f2a6325b85a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1679.149359] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1679.149645] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1679.287701] env[68617]: DEBUG nova.network.neutron [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1679.300488] env[68617]: INFO nova.compute.manager [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Took 0.35 seconds to deallocate network for instance. [ 1679.358559] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eeda0e4d-75b5-4f68-934e-420c62bcab65 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1679.366595] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abf44dcf-647b-4373-ae94-3528461f2119 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1679.400484] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17c8099c-acc1-427e-a140-7901fa57257d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1679.408153] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4c7fa42-0084-462d-9ef9-e3bbd0360b83 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1679.422260] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1679.424148] env[68617]: INFO nova.scheduler.client.report [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Deleted allocations for instance a8ff6232-530c-453a-96e4-f8ce00f976e3 [ 1679.432819] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 
'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1679.445096] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d8351a0d-ccef-409a-acf3-08793a19f1c7 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "a8ff6232-530c-453a-96e4-f8ce00f976e3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 581.927s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1679.445652] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "a8ff6232-530c-453a-96e4-f8ce00f976e3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 389.904s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1679.445927] env[68617]: INFO nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] During sync_power_state the instance has a pending task (spawning). Skip. [ 1679.446166] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "a8ff6232-530c-453a-96e4-f8ce00f976e3" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1679.446783] env[68617]: DEBUG oslo_concurrency.lockutils [None req-b1ed08e4-6131-4183-bd2d-81d6cbb25b53 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "a8ff6232-530c-453a-96e4-f8ce00f976e3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 385.483s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1679.447080] env[68617]: DEBUG oslo_concurrency.lockutils [None req-b1ed08e4-6131-4183-bd2d-81d6cbb25b53 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Acquiring lock "a8ff6232-530c-453a-96e4-f8ce00f976e3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1679.447332] env[68617]: DEBUG oslo_concurrency.lockutils [None req-b1ed08e4-6131-4183-bd2d-81d6cbb25b53 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "a8ff6232-530c-453a-96e4-f8ce00f976e3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1679.447539] env[68617]: DEBUG oslo_concurrency.lockutils [None req-b1ed08e4-6131-4183-bd2d-81d6cbb25b53 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "a8ff6232-530c-453a-96e4-f8ce00f976e3-events" "released" by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1679.449342] env[68617]: INFO nova.compute.manager [None req-b1ed08e4-6131-4183-bd2d-81d6cbb25b53 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Terminating instance [ 1679.451385] env[68617]: DEBUG nova.compute.manager [None req-b1ed08e4-6131-4183-bd2d-81d6cbb25b53 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1679.451638] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-b1ed08e4-6131-4183-bd2d-81d6cbb25b53 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1679.451946] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b95cf5c1-f2ea-4da9-8c3f-c3b807a1f3c1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1679.455327] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1679.455626] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.442s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1679.455966] env[68617]: DEBUG nova.compute.manager [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1679.464491] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c457e2f-ea98-4033-8566-863cdbc04089 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1679.494678] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-b1ed08e4-6131-4183-bd2d-81d6cbb25b53 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a8ff6232-530c-453a-96e4-f8ce00f976e3 could not be found. 
[ 1679.494883] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-b1ed08e4-6131-4183-bd2d-81d6cbb25b53 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1679.495072] env[68617]: INFO nova.compute.manager [None req-b1ed08e4-6131-4183-bd2d-81d6cbb25b53 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1679.495314] env[68617]: DEBUG oslo.service.loopingcall [None req-b1ed08e4-6131-4183-bd2d-81d6cbb25b53 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1679.497514] env[68617]: DEBUG nova.compute.manager [-] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1679.497619] env[68617]: DEBUG nova.network.neutron [-] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1679.512078] env[68617]: DEBUG oslo_concurrency.lockutils [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1679.512323] env[68617]: DEBUG oslo_concurrency.lockutils [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1679.514331] env[68617]: INFO nova.compute.claims [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1679.525990] env[68617]: DEBUG nova.network.neutron [-] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1679.541812] env[68617]: INFO nova.compute.manager [-] [instance: a8ff6232-530c-453a-96e4-f8ce00f976e3] Took 0.04 seconds to deallocate network for instance.
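Note how the claim for instance 12ed2a40-3d74-49a2-95b4-ccaaf58c8060 serializes on the same named lock, "compute_resources", that the claim abort and the periodic resource audit used earlier; the waited/held durations in these records (waited 0.000s here, held 0.299s below) are emitted by oslo.concurrency's lock wrapper. A minimal runnable example of that pattern, assuming oslo.concurrency is installed; the claim body is reduced to a dictionary update rather than Nova's real resource tracker:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def instance_claim(instance, tracked):
        # Only one claim/abort/audit can mutate the tracked resources at a
        # time; with debug logging enabled, lockutils prints the same
        # acquired/released lines (with waited/held times) seen above.
        tracked['vcpus_used'] += instance['vcpus']
        tracked['memory_mb_used'] += instance['memory_mb']
        return tracked

    # Example: instance_claim({'vcpus': 1, 'memory_mb': 128},
    #                         {'vcpus_used': 8, 'memory_mb_used': 1536})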
[ 1679.627199] env[68617]: DEBUG oslo_concurrency.lockutils [None req-b1ed08e4-6131-4183-bd2d-81d6cbb25b53 tempest-ListServersNegativeTestJSON-361708762 tempest-ListServersNegativeTestJSON-361708762-project-member] Lock "a8ff6232-530c-453a-96e4-f8ce00f976e3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.180s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1679.724546] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-715a31a4-067f-45eb-91ff-bfa5386269e7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1679.733248] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f52b437-1923-4ba6-858f-3b1824c472cf {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1679.763419] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffe86497-d0df-4761-b14f-1c7c8f0866c4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1679.771685] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad48a5c5-7de7-4428-adce-cffb8009448a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1679.787289] env[68617]: DEBUG nova.compute.provider_tree [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1679.795980] env[68617]: DEBUG nova.scheduler.client.report [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1679.811239] env[68617]: DEBUG oslo_concurrency.lockutils [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.299s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1679.811735] env[68617]: DEBUG nova.compute.manager [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Start building networks asynchronously for instance.
{{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1679.848657] env[68617]: DEBUG nova.compute.utils [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1679.849518] env[68617]: DEBUG nova.compute.manager [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1679.849685] env[68617]: DEBUG nova.network.neutron [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1679.861040] env[68617]: DEBUG nova.compute.manager [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1679.913237] env[68617]: DEBUG nova.policy [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '546f17dfba284c76b4ff2dde1a09928a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '162ecdbf203345a5b63167459e388608', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 1679.922196] env[68617]: DEBUG nova.compute.manager [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1679.946882] env[68617]: DEBUG nova.virt.hardware [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=<?>,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-17T05:31:26Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1679.947152] env[68617]: DEBUG nova.virt.hardware [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1679.947313] env[68617]: DEBUG nova.virt.hardware [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1679.947527] env[68617]: DEBUG nova.virt.hardware [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1679.947705] env[68617]: DEBUG nova.virt.hardware [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1679.947949] env[68617]: DEBUG nova.virt.hardware [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1679.948201] env[68617]: DEBUG nova.virt.hardware [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1679.948365] env[68617]: DEBUG nova.virt.hardware [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1679.948532] env[68617]: DEBUG nova.virt.hardware [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761
tempest-ServersTestJSON-1350841761-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1679.948691] env[68617]: DEBUG nova.virt.hardware [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1679.948863] env[68617]: DEBUG nova.virt.hardware [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1679.949758] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf784427-2377-4237-bad2-b6044c96b16b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1679.957952] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be709821-3860-408a-b424-821503b26f1e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1680.200118] env[68617]: DEBUG nova.network.neutron [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Successfully created port: 30741012-9add-440d-ab33-4a49d699409d {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1680.435414] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1680.435414] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1680.751198] env[68617]: DEBUG nova.network.neutron [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Successfully updated port: 30741012-9add-440d-ab33-4a49d699409d {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1680.767321] env[68617]: DEBUG oslo_concurrency.lockutils [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquiring lock "refresh_cache-12ed2a40-3d74-49a2-95b4-ccaaf58c8060" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1680.767455] env[68617]: DEBUG oslo_concurrency.lockutils [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquired lock "refresh_cache-12ed2a40-3d74-49a2-95b4-ccaaf58c8060" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1680.767579] env[68617]: DEBUG nova.network.neutron [None 
req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1680.804871] env[68617]: DEBUG nova.network.neutron [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1680.963709] env[68617]: DEBUG nova.network.neutron [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Updating instance_info_cache with network_info: [{"id": "30741012-9add-440d-ab33-4a49d699409d", "address": "fa:16:3e:a0:1a:2b", "network": {"id": "e6650a9f-f26d-481d-8658-10ff40328891", "bridge": "br-int", "label": "tempest-ServersTestJSON-1149134727-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "162ecdbf203345a5b63167459e388608", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "aa09e855-8af1-419b-b78d-8ffcc94b1bfb", "external-id": "nsx-vlan-transportzone-901", "segmentation_id": 901, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap30741012-9a", "ovs_interfaceid": "30741012-9add-440d-ab33-4a49d699409d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1680.976318] env[68617]: DEBUG oslo_concurrency.lockutils [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Releasing lock "refresh_cache-12ed2a40-3d74-49a2-95b4-ccaaf58c8060" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1680.976604] env[68617]: DEBUG nova.compute.manager [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Instance network_info: |[{"id": "30741012-9add-440d-ab33-4a49d699409d", "address": "fa:16:3e:a0:1a:2b", "network": {"id": "e6650a9f-f26d-481d-8658-10ff40328891", "bridge": "br-int", "label": "tempest-ServersTestJSON-1149134727-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "162ecdbf203345a5b63167459e388608", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": 
{"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "aa09e855-8af1-419b-b78d-8ffcc94b1bfb", "external-id": "nsx-vlan-transportzone-901", "segmentation_id": 901, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap30741012-9a", "ovs_interfaceid": "30741012-9add-440d-ab33-4a49d699409d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1680.977019] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a0:1a:2b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'aa09e855-8af1-419b-b78d-8ffcc94b1bfb', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '30741012-9add-440d-ab33-4a49d699409d', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1680.984975] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Creating folder: Project (162ecdbf203345a5b63167459e388608). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1680.985590] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9603b676-2687-4821-922f-c368df974fc2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1680.996599] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Created folder: Project (162ecdbf203345a5b63167459e388608) in parent group-v693691. [ 1680.996775] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Creating folder: Instances. Parent ref: group-v693781. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1680.997072] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-43a8ae9d-af84-4359-84db-4849b4cc1d32 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1681.005591] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Created folder: Instances in parent group-v693781. [ 1681.005816] env[68617]: DEBUG oslo.service.loopingcall [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1681.006062] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1681.006258] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-81dfcfbd-cb5f-4b3e-a861-9db30751820b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1681.025920] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1681.025920] env[68617]: value = "task-3470853" [ 1681.025920] env[68617]: _type = "Task" [ 1681.025920] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1681.034352] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470853, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1681.389308] env[68617]: DEBUG nova.compute.manager [req-48f74d48-00ef-476a-bc59-d4dce7b4fb19 req-71937811-d2b4-4a2f-8bc4-112a33d8b6a8 service nova] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Received event network-vif-plugged-30741012-9add-440d-ab33-4a49d699409d {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1681.389490] env[68617]: DEBUG oslo_concurrency.lockutils [req-48f74d48-00ef-476a-bc59-d4dce7b4fb19 req-71937811-d2b4-4a2f-8bc4-112a33d8b6a8 service nova] Acquiring lock "12ed2a40-3d74-49a2-95b4-ccaaf58c8060-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1681.389703] env[68617]: DEBUG oslo_concurrency.lockutils [req-48f74d48-00ef-476a-bc59-d4dce7b4fb19 req-71937811-d2b4-4a2f-8bc4-112a33d8b6a8 service nova] Lock "12ed2a40-3d74-49a2-95b4-ccaaf58c8060-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1681.389873] env[68617]: DEBUG oslo_concurrency.lockutils [req-48f74d48-00ef-476a-bc59-d4dce7b4fb19 req-71937811-d2b4-4a2f-8bc4-112a33d8b6a8 service nova] Lock "12ed2a40-3d74-49a2-95b4-ccaaf58c8060-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1681.390186] env[68617]: DEBUG nova.compute.manager [req-48f74d48-00ef-476a-bc59-d4dce7b4fb19 req-71937811-d2b4-4a2f-8bc4-112a33d8b6a8 service nova] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] No waiting events found dispatching network-vif-plugged-30741012-9add-440d-ab33-4a49d699409d {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1681.390393] env[68617]: WARNING nova.compute.manager [req-48f74d48-00ef-476a-bc59-d4dce7b4fb19 req-71937811-d2b4-4a2f-8bc4-112a33d8b6a8 service nova] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Received unexpected event network-vif-plugged-30741012-9add-440d-ab33-4a49d699409d for instance with vm_state building and task_state spawning. 
[ 1681.390559] env[68617]: DEBUG nova.compute.manager [req-48f74d48-00ef-476a-bc59-d4dce7b4fb19 req-71937811-d2b4-4a2f-8bc4-112a33d8b6a8 service nova] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Received event network-changed-30741012-9add-440d-ab33-4a49d699409d {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1681.390716] env[68617]: DEBUG nova.compute.manager [req-48f74d48-00ef-476a-bc59-d4dce7b4fb19 req-71937811-d2b4-4a2f-8bc4-112a33d8b6a8 service nova] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Refreshing instance network info cache due to event network-changed-30741012-9add-440d-ab33-4a49d699409d. {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1681.390919] env[68617]: DEBUG oslo_concurrency.lockutils [req-48f74d48-00ef-476a-bc59-d4dce7b4fb19 req-71937811-d2b4-4a2f-8bc4-112a33d8b6a8 service nova] Acquiring lock "refresh_cache-12ed2a40-3d74-49a2-95b4-ccaaf58c8060" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1681.391049] env[68617]: DEBUG oslo_concurrency.lockutils [req-48f74d48-00ef-476a-bc59-d4dce7b4fb19 req-71937811-d2b4-4a2f-8bc4-112a33d8b6a8 service nova] Acquired lock "refresh_cache-12ed2a40-3d74-49a2-95b4-ccaaf58c8060" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1681.391211] env[68617]: DEBUG nova.network.neutron [req-48f74d48-00ef-476a-bc59-d4dce7b4fb19 req-71937811-d2b4-4a2f-8bc4-112a33d8b6a8 service nova] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Refreshing network info cache for port 30741012-9add-440d-ab33-4a49d699409d {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1681.536899] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470853, 'name': CreateVM_Task, 'duration_secs': 0.274075} completed successfully. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1681.537118] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1681.537730] env[68617]: DEBUG oslo_concurrency.lockutils [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1681.537893] env[68617]: DEBUG oslo_concurrency.lockutils [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1681.538222] env[68617]: DEBUG oslo_concurrency.lockutils [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1681.538458] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c2ecd02b-3258-4eda-bec9-763911d1b5e8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1681.544292] env[68617]: DEBUG oslo_vmware.api [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Waiting for the task: (returnval){ [ 1681.544292] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]520a20e2-5688-a3b5-33a8-e3b05c6d8d64" [ 1681.544292] env[68617]: _type = "Task" [ 1681.544292] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1681.551591] env[68617]: DEBUG oslo_vmware.api [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]520a20e2-5688-a3b5-33a8-e3b05c6d8d64, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1681.626982] env[68617]: DEBUG nova.network.neutron [req-48f74d48-00ef-476a-bc59-d4dce7b4fb19 req-71937811-d2b4-4a2f-8bc4-112a33d8b6a8 service nova] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Updated VIF entry in instance network info cache for port 30741012-9add-440d-ab33-4a49d699409d. 
{{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1681.627341] env[68617]: DEBUG nova.network.neutron [req-48f74d48-00ef-476a-bc59-d4dce7b4fb19 req-71937811-d2b4-4a2f-8bc4-112a33d8b6a8 service nova] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Updating instance_info_cache with network_info: [{"id": "30741012-9add-440d-ab33-4a49d699409d", "address": "fa:16:3e:a0:1a:2b", "network": {"id": "e6650a9f-f26d-481d-8658-10ff40328891", "bridge": "br-int", "label": "tempest-ServersTestJSON-1149134727-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "162ecdbf203345a5b63167459e388608", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "aa09e855-8af1-419b-b78d-8ffcc94b1bfb", "external-id": "nsx-vlan-transportzone-901", "segmentation_id": 901, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap30741012-9a", "ovs_interfaceid": "30741012-9add-440d-ab33-4a49d699409d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1681.636974] env[68617]: DEBUG oslo_concurrency.lockutils [req-48f74d48-00ef-476a-bc59-d4dce7b4fb19 req-71937811-d2b4-4a2f-8bc4-112a33d8b6a8 service nova] Releasing lock "refresh_cache-12ed2a40-3d74-49a2-95b4-ccaaf58c8060" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1682.055012] env[68617]: DEBUG oslo_concurrency.lockutils [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1682.055012] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1682.055012] env[68617]: DEBUG oslo_concurrency.lockutils [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1695.035047] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "b1a8dc60-af98-4f80-96cf-b2550ea8c13a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1695.035522] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "b1a8dc60-af98-4f80-96cf-b2550ea8c13a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1698.064803] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Acquiring lock "a4ab788d-327a-47cc-8ae7-e1b9be889759" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1698.065139] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Lock "a4ab788d-327a-47cc-8ae7-e1b9be889759" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1721.586593] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2b706b28-1b8c-4103-9ebc-58c321e14b9f tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Acquiring lock "fe0d64a6-6ce6-4ef5-8ae1-a160c5ec0987" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1721.586952] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2b706b28-1b8c-4103-9ebc-58c321e14b9f tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Lock "fe0d64a6-6ce6-4ef5-8ae1-a160c5ec0987" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1727.973110] env[68617]: WARNING oslo_vmware.rw_handles [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1727.973110] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1727.973110] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1727.973110] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1727.973110] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1727.973110] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 1727.973110] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1727.973110] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = 
self._read_status() [ 1727.973110] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1727.973110] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1727.973110] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1727.973110] env[68617]: ERROR oslo_vmware.rw_handles [ 1727.973110] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/68582ced-a982-4e0b-bd3f-4467139f3f24/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1727.974835] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1727.975091] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Copying Virtual Disk [datastore2] vmware_temp/68582ced-a982-4e0b-bd3f-4467139f3f24/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/68582ced-a982-4e0b-bd3f-4467139f3f24/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1727.975394] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b8b2892a-53d1-43f5-b710-a37c9dfcaa1d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1727.983838] env[68617]: DEBUG oslo_vmware.api [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Waiting for the task: (returnval){ [ 1727.983838] env[68617]: value = "task-3470854" [ 1727.983838] env[68617]: _type = "Task" [ 1727.983838] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1727.991607] env[68617]: DEBUG oslo_vmware.api [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Task: {'id': task-3470854, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1728.495363] env[68617]: DEBUG oslo_vmware.exceptions [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Fault InvalidArgument not matched. 
{{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1728.495696] env[68617]: DEBUG oslo_concurrency.lockutils [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1728.496394] env[68617]: ERROR nova.compute.manager [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1728.496394] env[68617]: Faults: ['InvalidArgument'] [ 1728.496394] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Traceback (most recent call last): [ 1728.496394] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1728.496394] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] yield resources [ 1728.496394] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1728.496394] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] self.driver.spawn(context, instance, image_meta, [ 1728.496394] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1728.496394] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1728.496394] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1728.496394] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] self._fetch_image_if_missing(context, vi) [ 1728.496394] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1728.496811] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] image_cache(vi, tmp_image_ds_loc) [ 1728.496811] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1728.496811] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] vm_util.copy_virtual_disk( [ 1728.496811] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1728.496811] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] session._wait_for_task(vmdk_copy_task) [ 1728.496811] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1728.496811] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] return self.wait_for_task(task_ref) [ 1728.496811] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1728.496811] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] return evt.wait() [ 1728.496811] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1728.496811] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] result = hub.switch() [ 1728.496811] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1728.496811] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] return self.greenlet.switch() [ 1728.497176] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1728.497176] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] self.f(*self.args, **self.kw) [ 1728.497176] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1728.497176] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] raise exceptions.translate_fault(task_info.error) [ 1728.497176] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1728.497176] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Faults: ['InvalidArgument'] [ 1728.497176] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] [ 1728.497176] env[68617]: INFO nova.compute.manager [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Terminating instance [ 1728.498237] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1728.498440] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1728.498697] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-5589865d-b53b-4144-993b-bc394e4450cc {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1728.500817] env[68617]: DEBUG nova.compute.manager [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1728.501014] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1728.501726] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebe87a3f-c1a5-435a-b9d1-1f0803fe5553 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1728.509876] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1728.510112] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b34bb6ef-a493-4080-9ada-0e35dcc31664 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1728.512433] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1728.512632] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1728.513599] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-db04ae97-abfd-401b-8de7-5a8684f10b9b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1728.518227] env[68617]: DEBUG oslo_vmware.api [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Waiting for the task: (returnval){ [ 1728.518227] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]5287b102-3d9f-c38a-c616-b89af3b26a28" [ 1728.518227] env[68617]: _type = "Task" [ 1728.518227] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1728.525260] env[68617]: DEBUG oslo_vmware.api [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]5287b102-3d9f-c38a-c616-b89af3b26a28, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1728.579277] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1728.579502] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1728.579668] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Deleting the datastore file [datastore2] 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1728.579919] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-424cd257-5501-49ae-a6da-bd4addb7879e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1728.586201] env[68617]: DEBUG oslo_vmware.api [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Waiting for the task: (returnval){ [ 1728.586201] env[68617]: value = "task-3470856" [ 1728.586201] env[68617]: _type = "Task" [ 1728.586201] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1728.593828] env[68617]: DEBUG oslo_vmware.api [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Task: {'id': task-3470856, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1729.028659] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1729.029019] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Creating directory with path [datastore2] vmware_temp/3f38bbfe-ca2e-4fbf-8252-e60dd940820b/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1729.029121] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6403ebfa-897a-4aad-accd-9f5f8e5562c8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.039939] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Created directory with path [datastore2] vmware_temp/3f38bbfe-ca2e-4fbf-8252-e60dd940820b/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1729.040121] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Fetch image to [datastore2] vmware_temp/3f38bbfe-ca2e-4fbf-8252-e60dd940820b/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1729.040276] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/3f38bbfe-ca2e-4fbf-8252-e60dd940820b/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1729.040952] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e748fc9-8cd9-4f3b-9a4e-e6e9cb0474d8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.047148] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b750f47b-63c4-49de-a422-953edab86284 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.055971] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e289e868-df35-4dbc-ac2a-9483b7cfcf96 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.090321] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-cc0e21d6-e05b-4ed9-85e2-8c614eb94d27 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.097221] env[68617]: DEBUG oslo_vmware.api [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Task: {'id': task-3470856, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.06584} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1729.098634] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1729.098828] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1729.098999] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1729.099188] env[68617]: INFO nova.compute.manager [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1729.101042] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4b6f03ad-0bd4-4dc5-8eb0-904ca9769bf4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.102716] env[68617]: DEBUG nova.compute.claims [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1729.102888] env[68617]: DEBUG oslo_concurrency.lockutils [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1729.103109] env[68617]: DEBUG oslo_concurrency.lockutils [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1729.124041] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1729.272639] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1729.273564] env[68617]: ERROR nova.compute.manager [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image c87eab51-bc9a-44dc-8f0d-7ab73283e453. 
[ 1729.273564] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Traceback (most recent call last): [ 1729.273564] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1729.273564] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1729.273564] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1729.273564] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] result = getattr(controller, method)(*args, **kwargs) [ 1729.273564] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1729.273564] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return self._get(image_id) [ 1729.273564] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1729.273564] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1729.273564] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1729.273908] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] resp, body = self.http_client.get(url, headers=header) [ 1729.273908] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1729.273908] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return self.request(url, 'GET', **kwargs) [ 1729.273908] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1729.273908] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return self._handle_response(resp) [ 1729.273908] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1729.273908] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] raise exc.from_response(resp, resp.content) [ 1729.273908] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1729.273908] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] [ 1729.273908] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] During handling of the above exception, another exception occurred: [ 1729.273908] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] [ 1729.273908] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Traceback (most recent call last): [ 1729.274297] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1729.274297] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] yield resources [ 1729.274297] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1729.274297] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] self.driver.spawn(context, instance, image_meta, [ 1729.274297] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1729.274297] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1729.274297] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1729.274297] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] self._fetch_image_if_missing(context, vi) [ 1729.274297] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1729.274297] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] image_fetch(context, vi, tmp_image_ds_loc) [ 1729.274297] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1729.274297] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] images.fetch_image( [ 1729.274297] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1729.274746] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] metadata = IMAGE_API.get(context, image_ref) [ 1729.274746] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1729.274746] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return session.show(context, image_id, [ 1729.274746] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1729.274746] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] _reraise_translated_image_exception(image_id) [ 1729.274746] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1729.274746] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] raise new_exc.with_traceback(exc_trace) [ 1729.274746] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1729.274746] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1729.274746] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1729.274746] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] result = getattr(controller, method)(*args, **kwargs) [ 1729.274746] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1729.274746] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return self._get(image_id) [ 1729.275154] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1729.275154] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1729.275154] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1729.275154] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] resp, body = self.http_client.get(url, headers=header) [ 1729.275154] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1729.275154] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return self.request(url, 'GET', **kwargs) [ 1729.275154] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1729.275154] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return self._handle_response(resp) [ 1729.275154] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1729.275154] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] raise exc.from_response(resp, resp.content) [ 1729.275154] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] nova.exception.ImageNotAuthorized: Not authorized for image c87eab51-bc9a-44dc-8f0d-7ab73283e453. 
[ 1729.275154] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] [ 1729.275508] env[68617]: INFO nova.compute.manager [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Terminating instance [ 1729.275508] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1729.275581] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1729.276163] env[68617]: DEBUG nova.compute.manager [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1729.276376] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1729.278555] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-85514114-8e9f-4cae-a862-ca036e3f80e6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.282493] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a6be200-cd98-40f7-880a-f56aedf79371 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.289190] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1729.289470] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-637cb03d-40a4-4f10-9bf4-ab3597a7b42c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.291695] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1729.291865] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None 
req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1729.292778] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6a3fb487-4d0b-4189-9795-52606e682784 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.299503] env[68617]: DEBUG oslo_vmware.api [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Waiting for the task: (returnval){ [ 1729.299503] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52e510dd-ef83-a8df-9994-3d622d3acf87" [ 1729.299503] env[68617]: _type = "Task" [ 1729.299503] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1729.309920] env[68617]: DEBUG oslo_vmware.api [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52e510dd-ef83-a8df-9994-3d622d3acf87, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1729.352028] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1729.352261] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1729.352409] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Deleting the datastore file [datastore2] e90877a8-47d3-47d7-8362-5bcfe3a98c36 {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1729.352665] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ae9b5611-f7c9-4508-8408-4a88a15225db {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.359302] env[68617]: DEBUG oslo_vmware.api [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Waiting for the task: (returnval){ [ 1729.359302] env[68617]: value = "task-3470858" [ 1729.359302] env[68617]: _type = "Task" [ 1729.359302] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1729.368596] env[68617]: DEBUG oslo_vmware.api [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Task: {'id': task-3470858, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1729.384821] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c01ce15-29e9-47ca-a070-8c3b5cbf727c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.391549] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18917a4b-d5a4-46c7-a08a-f210d96a99bf {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.421926] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ae7eb98-7a25-412e-82b9-329c55422d8b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.429030] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-583ee44c-c4fb-4773-90f9-a0dfc6e0f4cd {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.442917] env[68617]: DEBUG nova.compute.provider_tree [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1729.455205] env[68617]: DEBUG nova.scheduler.client.report [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1729.468867] env[68617]: DEBUG oslo_concurrency.lockutils [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.366s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1729.469419] env[68617]: ERROR nova.compute.manager [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified 
parameter was not correct: fileType [ 1729.469419] env[68617]: Faults: ['InvalidArgument'] [ 1729.469419] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Traceback (most recent call last): [ 1729.469419] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1729.469419] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] self.driver.spawn(context, instance, image_meta, [ 1729.469419] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1729.469419] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1729.469419] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1729.469419] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] self._fetch_image_if_missing(context, vi) [ 1729.469419] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1729.469419] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] image_cache(vi, tmp_image_ds_loc) [ 1729.469419] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1729.469798] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] vm_util.copy_virtual_disk( [ 1729.469798] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1729.469798] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] session._wait_for_task(vmdk_copy_task) [ 1729.469798] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1729.469798] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] return self.wait_for_task(task_ref) [ 1729.469798] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1729.469798] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] return evt.wait() [ 1729.469798] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1729.469798] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] result = hub.switch() [ 1729.469798] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1729.469798] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] return self.greenlet.switch() [ 1729.469798] env[68617]: 
ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1729.469798] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] self.f(*self.args, **self.kw) [ 1729.470170] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1729.470170] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] raise exceptions.translate_fault(task_info.error) [ 1729.470170] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1729.470170] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Faults: ['InvalidArgument'] [ 1729.470170] env[68617]: ERROR nova.compute.manager [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] [ 1729.470170] env[68617]: DEBUG nova.compute.utils [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1729.471580] env[68617]: DEBUG nova.compute.manager [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Build of instance 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d was re-scheduled: A specified parameter was not correct: fileType [ 1729.471580] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1729.471937] env[68617]: DEBUG nova.compute.manager [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1729.472121] env[68617]: DEBUG nova.compute.manager [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1729.472289] env[68617]: DEBUG nova.compute.manager [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1729.472447] env[68617]: DEBUG nova.network.neutron [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1729.809492] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1729.809764] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Creating directory with path [datastore2] vmware_temp/93d19a37-bb3a-4c4b-a665-2fb3c480306f/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1729.809999] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b278934a-8ee5-49b8-9796-cc046995377f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.823045] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Created directory with path [datastore2] vmware_temp/93d19a37-bb3a-4c4b-a665-2fb3c480306f/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1729.823165] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Fetch image to [datastore2] vmware_temp/93d19a37-bb3a-4c4b-a665-2fb3c480306f/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1729.823302] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/93d19a37-bb3a-4c4b-a665-2fb3c480306f/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1729.824060] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14eb9681-f555-4efc-9e79-a0bba2038ac5 {{(pid=68617) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.833243] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45bb1c82-5bb9-4bf2-8423-0babdc05294a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.843188] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0dee0d4-4361-4baf-a528-c81a42d55b2a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.847230] env[68617]: DEBUG nova.network.neutron [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1729.877835] env[68617]: INFO nova.compute.manager [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Took 0.41 seconds to deallocate network for instance. [ 1729.883402] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6519cc67-6af7-4640-80b3-576e3c8eb49b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.891353] env[68617]: DEBUG oslo_vmware.api [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Task: {'id': task-3470858, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074139} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1729.892889] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1729.893092] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1729.893272] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1729.894298] env[68617]: INFO nova.compute.manager [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Took 0.62 seconds to destroy the instance on the hypervisor. 
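The Task: {'id': task-3470858, ...} progress / completed-successfully lines above come from oslo.vmware's wait_for_task and _poll_task, which poll the vCenter task object until it reports success or error. Below is a rough stand-in for that loop; get_task_info, TaskFailed and the dict-based task states are invented for this sketch, and the real library runs the poll inside a looping call and translates task_info.error into a VimFaultException.

import time

class TaskFailed(Exception):
    pass

def wait_for_task(get_task_info, interval=0.5, timeout=60.0):
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = get_task_info()
        if info["state"] == "success":
            return info
        if info["state"] == "error":
            raise TaskFailed(info.get("error", "unknown fault"))
        # Mirrors the "Task: {...} progress is 0%." debug lines above.
        print("Task: %s progress is %d%%." % (info["name"], info["progress"]))
        time.sleep(interval)
    raise TimeoutError("task did not complete in time")

if __name__ == "__main__":
    states = iter([
        {"name": "DeleteDatastoreFile_Task", "state": "running", "progress": 0},
        {"name": "DeleteDatastoreFile_Task", "state": "success", "progress": 100},
    ])
    print(wait_for_task(lambda: next(states), interval=0.01))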
[ 1729.895390] env[68617]: DEBUG nova.compute.claims [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1729.895550] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1729.895981] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1729.898247] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e98596da-c5ba-4954-b051-34b38944d8fb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.922827] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1729.982298] env[68617]: DEBUG oslo_vmware.rw_handles [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/93d19a37-bb3a-4c4b-a665-2fb3c480306f/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1730.038581] env[68617]: INFO nova.scheduler.client.report [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Deleted allocations for instance 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d [ 1730.049150] env[68617]: DEBUG oslo_vmware.rw_handles [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Completed reading data from the image iterator. 
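The oslo_vmware.rw_handles records above describe streaming the Glance image iterator into an HTTPS write handle pointed at the datastore folder URL. The sketch below mirrors only the copy loop and the two messages it logs; an io.BytesIO stands in for the real HTTP connection, and transfer_image is an invented helper name.

import io

def transfer_image(image_iterator, write_handle):
    total = 0
    for chunk in image_iterator:
        write_handle.write(chunk)
        total += len(chunk)
    print("Completed reading data from the image iterator.")
    write_handle.close()
    print("Closing write handle.")
    return total

if __name__ == "__main__":
    fake_image = (b"\0" * 1024 for _ in range(8))  # 8 KiB of stand-in VMDK data
    print(transfer_image(fake_image, io.BytesIO()), "bytes written")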
{{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1730.049336] env[68617]: DEBUG oslo_vmware.rw_handles [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/93d19a37-bb3a-4c4b-a665-2fb3c480306f/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1730.063926] env[68617]: DEBUG oslo_concurrency.lockutils [None req-32ec2cf5-6002-4ad5-9e7d-aabbf19ca706 tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Lock "5f31aef1-4806-48e1-9d5a-5dff09ea0f0d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 625.345s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1730.065408] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "5f31aef1-4806-48e1-9d5a-5dff09ea0f0d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 440.523s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1730.065408] env[68617]: INFO nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] During sync_power_state the instance has a pending task (spawning). Skip. [ 1730.065548] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "5f31aef1-4806-48e1-9d5a-5dff09ea0f0d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1730.066077] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ea1854ca-db59-4b97-97e1-e32424d8ed6a tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Lock "5f31aef1-4806-48e1-9d5a-5dff09ea0f0d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 428.507s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1730.066356] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ea1854ca-db59-4b97-97e1-e32424d8ed6a tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Acquiring lock "5f31aef1-4806-48e1-9d5a-5dff09ea0f0d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1730.066576] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ea1854ca-db59-4b97-97e1-e32424d8ed6a tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Lock "5f31aef1-4806-48e1-9d5a-5dff09ea0f0d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68617) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1730.066740] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ea1854ca-db59-4b97-97e1-e32424d8ed6a tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Lock "5f31aef1-4806-48e1-9d5a-5dff09ea0f0d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1730.068557] env[68617]: INFO nova.compute.manager [None req-ea1854ca-db59-4b97-97e1-e32424d8ed6a tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Terminating instance [ 1730.070323] env[68617]: DEBUG nova.compute.manager [None req-ea1854ca-db59-4b97-97e1-e32424d8ed6a tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1730.070559] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ea1854ca-db59-4b97-97e1-e32424d8ed6a tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1730.070838] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-37f6ea37-c4b0-4810-98ed-daa20b9c0cc4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.081351] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef1f1c54-c40f-435a-a86f-f8c4e951084e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.093934] env[68617]: DEBUG nova.compute.manager [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1730.113927] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-ea1854ca-db59-4b97-97e1-e32424d8ed6a tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d could not be found. 
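The Acquiring lock / acquired ... waited / "released" ... held triples that dominate this stretch are oslo.concurrency's lockutils accounting around named locks. Here is a self-contained approximation using one threading.Lock per name; named_lock and _get_lock are illustrative helpers, not the lockutils API, and the real code uses fasteners and also supports external file-based locks.

import threading
import time
from contextlib import contextmanager

_locks = {}
_registry_guard = threading.Lock()

def _get_lock(name):
    # One process-local lock object per lock name.
    with _registry_guard:
        return _locks.setdefault(name, threading.Lock())

@contextmanager
def named_lock(name, caller):
    lock = _get_lock(name)
    print('Acquiring lock "%s" by "%s"' % (name, caller))
    start = time.monotonic()
    with lock:
        waited = time.monotonic() - start
        print('Lock "%s" acquired by "%s" :: waited %.3fs'
              % (name, caller, waited))
        held_from = time.monotonic()
        try:
            yield
        finally:
            held = time.monotonic() - held_from
            print('Lock "%s" "released" by "%s" :: held %.3fs'
                  % (name, caller, held))

if __name__ == "__main__":
    with named_lock("compute_resources",
                    "nova.compute.resource_tracker."
                    "ResourceTracker.abort_instance_claim"):
        time.sleep(0.05)  # simulated critical section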
[ 1730.114119] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ea1854ca-db59-4b97-97e1-e32424d8ed6a tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1730.114352] env[68617]: INFO nova.compute.manager [None req-ea1854ca-db59-4b97-97e1-e32424d8ed6a tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1730.114580] env[68617]: DEBUG oslo.service.loopingcall [None req-ea1854ca-db59-4b97-97e1-e32424d8ed6a tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1730.117015] env[68617]: DEBUG nova.compute.manager [-] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1730.117158] env[68617]: DEBUG nova.network.neutron [-] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1730.143579] env[68617]: DEBUG nova.network.neutron [-] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1730.149171] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1730.152344] env[68617]: INFO nova.compute.manager [-] [instance: 5f31aef1-4806-48e1-9d5a-5dff09ea0f0d] Took 0.04 seconds to deallocate network for instance. 
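The "Waiting for function ..._deallocate_network_with_retries to return" record is oslo.service's looping-call machinery driving a retry loop; the wrapped function signals completion by raising LoopingCallDone. A plain-Python stand-in for that control flow follows; run_looping_call and the demo function are invented for this sketch, and only LoopingCallDone mirrors a real oslo_service.loopingcall class.

import time

class LoopingCallDone(Exception):
    """Raised by the looped function to stop the loop and return a value."""
    def __init__(self, retvalue=None):
        self.retvalue = retvalue

def run_looping_call(func, interval=0.1):
    while True:
        try:
            func()
        except LoopingCallDone as done:
            return done.retvalue
        time.sleep(interval)

if __name__ == "__main__":
    attempts = {"count": 0}

    def deallocate_network_with_retries():
        attempts["count"] += 1
        if attempts["count"] < 3:     # simulate two transient failures
            return                    # returning lets the loop call us again
        raise LoopingCallDone("deallocated")

    print(run_looping_call(deallocate_network_with_retries, interval=0.01))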
[ 1730.227115] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22bd9632-b2f3-4072-9b21-38b2a32c561a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.234248] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5858e72d-811f-4afd-9c71-5132b246a40b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.266534] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9aede5f-3a7c-4fbb-892b-37cee5139871 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.269282] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ea1854ca-db59-4b97-97e1-e32424d8ed6a tempest-ServerMetadataNegativeTestJSON-1551197108 tempest-ServerMetadataNegativeTestJSON-1551197108-project-member] Lock "5f31aef1-4806-48e1-9d5a-5dff09ea0f0d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.203s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1730.275935] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd307f3e-2b46-4cd8-9b8d-d3171a9d20fb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.290891] env[68617]: DEBUG nova.compute.provider_tree [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1730.300032] env[68617]: DEBUG nova.scheduler.client.report [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1730.313183] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.417s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1730.313921] env[68617]: ERROR nova.compute.manager [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Failed to build and run instance: nova.exception.ImageNotAuthorized: Not authorized for image 
c87eab51-bc9a-44dc-8f0d-7ab73283e453. [ 1730.313921] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Traceback (most recent call last): [ 1730.313921] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1730.313921] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1730.313921] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1730.313921] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] result = getattr(controller, method)(*args, **kwargs) [ 1730.313921] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1730.313921] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return self._get(image_id) [ 1730.313921] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1730.313921] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1730.313921] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1730.314298] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] resp, body = self.http_client.get(url, headers=header) [ 1730.314298] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1730.314298] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return self.request(url, 'GET', **kwargs) [ 1730.314298] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1730.314298] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return self._handle_response(resp) [ 1730.314298] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1730.314298] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] raise exc.from_response(resp, resp.content) [ 1730.314298] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1730.314298] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] [ 1730.314298] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] During handling of the above exception, another exception occurred: [ 1730.314298] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] [ 1730.314298] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Traceback (most recent call last): [ 1730.314729] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1730.314729] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] self.driver.spawn(context, instance, image_meta, [ 1730.314729] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1730.314729] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1730.314729] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1730.314729] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] self._fetch_image_if_missing(context, vi) [ 1730.314729] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1730.314729] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] image_fetch(context, vi, tmp_image_ds_loc) [ 1730.314729] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1730.314729] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] images.fetch_image( [ 1730.314729] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1730.314729] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] metadata = IMAGE_API.get(context, image_ref) [ 1730.314729] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1730.315131] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return session.show(context, image_id, [ 1730.315131] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1730.315131] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] _reraise_translated_image_exception(image_id) [ 1730.315131] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1730.315131] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] raise new_exc.with_traceback(exc_trace) [ 1730.315131] env[68617]: ERROR nova.compute.manager [instance: 
e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1730.315131] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1730.315131] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1730.315131] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] result = getattr(controller, method)(*args, **kwargs) [ 1730.315131] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1730.315131] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return self._get(image_id) [ 1730.315131] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1730.315131] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1730.315479] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1730.315479] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] resp, body = self.http_client.get(url, headers=header) [ 1730.315479] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1730.315479] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return self.request(url, 'GET', **kwargs) [ 1730.315479] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1730.315479] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return self._handle_response(resp) [ 1730.315479] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1730.315479] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] raise exc.from_response(resp, resp.content) [ 1730.315479] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] nova.exception.ImageNotAuthorized: Not authorized for image c87eab51-bc9a-44dc-8f0d-7ab73283e453. [ 1730.315479] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] [ 1730.315744] env[68617]: DEBUG nova.compute.utils [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Not authorized for image c87eab51-bc9a-44dc-8f0d-7ab73283e453. 
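Each of these 401 failures starts life in glanceclient's response handling, which maps error status codes onto typed exception classes before Nova translates them further. A hedged sketch of that mapping with stand-in classes follows; from_response here is modeled on, but is not, the real glanceclient.exc module.

class HTTPException(Exception):
    status = None

class HTTPUnauthorized(HTTPException):
    status = 401

class HTTPNotFound(HTTPException):
    status = 404

_EXC_BY_STATUS = {exc.status: exc for exc in (HTTPUnauthorized, HTTPNotFound)}

def from_response(status_code, body):
    # Pick the typed exception for the status code, falling back to the base.
    return _EXC_BY_STATUS.get(status_code, HTTPException)(body)

def handle_response(status_code, body):
    if status_code >= 400:
        raise from_response(status_code, body)
    return body

if __name__ == "__main__":
    try:
        handle_response(401, "This server could not verify that you are "
                             "authorized to access the document you requested.")
    except HTTPUnauthorized as exc:
        print("translated to %s: %s" % (type(exc).__name__, exc))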
{{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1730.315744] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.167s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1730.317148] env[68617]: INFO nova.compute.claims [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1730.319857] env[68617]: DEBUG nova.compute.manager [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Build of instance e90877a8-47d3-47d7-8362-5bcfe3a98c36 was re-scheduled: Not authorized for image c87eab51-bc9a-44dc-8f0d-7ab73283e453. {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1730.320197] env[68617]: DEBUG nova.compute.manager [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1730.320369] env[68617]: DEBUG nova.compute.manager [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
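The "Virt driver does not provide unplug_vifs method" record shows the cleanup path probing an optional driver capability before deallocating networking. A simplified stand-in follows; the class names and cleanup_allocated_networks are illustrative, under the assumption that the base driver raises NotImplementedError and the manager logs the message and continues.

class ComputeDriver:
    def unplug_vifs(self, instance, network_info):
        raise NotImplementedError()

class VMwareVCDriver(ComputeDriver):
    # No unplug_vifs override, matching the vmwareapi driver in this log.
    pass

def cleanup_allocated_networks(driver, instance, network_info):
    try:
        driver.unplug_vifs(instance, network_info)
    except NotImplementedError:
        # Message text kept verbatim from the log, including its phrasing.
        print("Virt driver does not provide unplug_vifs method, so it is "
              "not possible determine if VIFs should be unplugged.")
    print("Deallocating network for instance")

if __name__ == "__main__":
    cleanup_allocated_networks(VMwareVCDriver(),
                               "e90877a8-47d3-47d7-8362-5bcfe3a98c36", [])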
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1730.320518] env[68617]: DEBUG nova.compute.manager [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1730.320680] env[68617]: DEBUG nova.network.neutron [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1730.428420] env[68617]: DEBUG neutronclient.v2_0.client [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68617) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1730.429724] env[68617]: ERROR nova.compute.manager [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1730.429724] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Traceback (most recent call last): [ 1730.429724] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1730.429724] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1730.429724] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1730.429724] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] result = getattr(controller, method)(*args, **kwargs) [ 1730.429724] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1730.429724] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return self._get(image_id) [ 1730.429724] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1730.429724] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1730.429724] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1730.430117] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] resp, body = self.http_client.get(url, headers=header) [ 1730.430117] env[68617]: ERROR nova.compute.manager [instance: 
e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1730.430117] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return self.request(url, 'GET', **kwargs) [ 1730.430117] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1730.430117] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return self._handle_response(resp) [ 1730.430117] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1730.430117] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] raise exc.from_response(resp, resp.content) [ 1730.430117] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1730.430117] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] [ 1730.430117] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] During handling of the above exception, another exception occurred: [ 1730.430117] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] [ 1730.430117] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Traceback (most recent call last): [ 1730.430519] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1730.430519] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] self.driver.spawn(context, instance, image_meta, [ 1730.430519] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1730.430519] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1730.430519] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1730.430519] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] self._fetch_image_if_missing(context, vi) [ 1730.430519] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1730.430519] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] image_fetch(context, vi, tmp_image_ds_loc) [ 1730.430519] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1730.430519] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] 
images.fetch_image( [ 1730.430519] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1730.430519] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] metadata = IMAGE_API.get(context, image_ref) [ 1730.430519] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1730.430892] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return session.show(context, image_id, [ 1730.430892] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1730.430892] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] _reraise_translated_image_exception(image_id) [ 1730.430892] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1730.430892] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] raise new_exc.with_traceback(exc_trace) [ 1730.430892] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1730.430892] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1730.430892] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1730.430892] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] result = getattr(controller, method)(*args, **kwargs) [ 1730.430892] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1730.430892] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return self._get(image_id) [ 1730.430892] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1730.430892] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1730.431275] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1730.431275] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] resp, body = self.http_client.get(url, headers=header) [ 1730.431275] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1730.431275] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return self.request(url, 'GET', **kwargs) [ 1730.431275] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1730.431275] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return self._handle_response(resp) [ 1730.431275] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1730.431275] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] raise exc.from_response(resp, resp.content) [ 1730.431275] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] nova.exception.ImageNotAuthorized: Not authorized for image c87eab51-bc9a-44dc-8f0d-7ab73283e453. [ 1730.431275] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] [ 1730.431275] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] During handling of the above exception, another exception occurred: [ 1730.431275] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] [ 1730.431275] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Traceback (most recent call last): [ 1730.431660] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/compute/manager.py", line 2430, in _do_build_and_run_instance [ 1730.431660] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] self._build_and_run_instance(context, instance, image, [ 1730.431660] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/compute/manager.py", line 2722, in _build_and_run_instance [ 1730.431660] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] raise exception.RescheduledException( [ 1730.431660] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] nova.exception.RescheduledException: Build of instance e90877a8-47d3-47d7-8362-5bcfe3a98c36 was re-scheduled: Not authorized for image c87eab51-bc9a-44dc-8f0d-7ab73283e453. 
[ 1730.431660] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] [ 1730.431660] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] During handling of the above exception, another exception occurred: [ 1730.431660] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] [ 1730.431660] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Traceback (most recent call last): [ 1730.431660] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1730.431660] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] ret = obj(*args, **kwargs) [ 1730.431660] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1730.431660] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] exception_handler_v20(status_code, error_body) [ 1730.432064] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1730.432064] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] raise client_exc(message=error_message, [ 1730.432064] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1730.432064] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Neutron server returns request_ids: ['req-06af1657-7be1-4616-9d28-a26827d91d7f'] [ 1730.432064] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] [ 1730.432064] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] During handling of the above exception, another exception occurred: [ 1730.432064] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] [ 1730.432064] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Traceback (most recent call last): [ 1730.432064] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/compute/manager.py", line 3019, in _cleanup_allocated_networks [ 1730.432064] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] self._deallocate_network(context, instance, requested_networks) [ 1730.432064] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1730.432064] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] self.network_api.deallocate_for_instance( [ 1730.432064] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1730.432423] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] data = neutron.list_ports(**search_opts) [ 
1730.432423] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1730.432423] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] ret = obj(*args, **kwargs) [ 1730.432423] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1730.432423] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return self.list('ports', self.ports_path, retrieve_all, [ 1730.432423] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1730.432423] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] ret = obj(*args, **kwargs) [ 1730.432423] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1730.432423] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] for r in self._pagination(collection, path, **params): [ 1730.432423] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1730.432423] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] res = self.get(path, params=params) [ 1730.432423] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1730.432423] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] ret = obj(*args, **kwargs) [ 1730.432734] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1730.432734] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return self.retry_request("GET", action, body=body, [ 1730.432734] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1730.432734] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] ret = obj(*args, **kwargs) [ 1730.432734] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1730.432734] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return self.do_request(method, action, body=body, [ 1730.432734] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1730.432734] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] ret = obj(*args, **kwargs) [ 1730.432734] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 
1730.432734] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] self._handle_fault_response(status_code, replybody, resp) [ 1730.432734] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1730.432734] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] raise exception.Unauthorized() [ 1730.432734] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] nova.exception.Unauthorized: Not authorized. [ 1730.433061] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] [ 1730.480888] env[68617]: INFO nova.scheduler.client.report [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Deleted allocations for instance e90877a8-47d3-47d7-8362-5bcfe3a98c36 [ 1730.498514] env[68617]: DEBUG oslo_concurrency.lockutils [None req-5f9540c5-ea1e-440d-8dc8-b27bb47bb03b tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Lock "e90877a8-47d3-47d7-8362-5bcfe3a98c36" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 589.041s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1730.499808] env[68617]: DEBUG oslo_concurrency.lockutils [None req-dd4e098a-0a0c-4e84-aac8-2dcaf30a9e2f tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Lock "e90877a8-47d3-47d7-8362-5bcfe3a98c36" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 392.814s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1730.499899] env[68617]: DEBUG oslo_concurrency.lockutils [None req-dd4e098a-0a0c-4e84-aac8-2dcaf30a9e2f tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Acquiring lock "e90877a8-47d3-47d7-8362-5bcfe3a98c36-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1730.500091] env[68617]: DEBUG oslo_concurrency.lockutils [None req-dd4e098a-0a0c-4e84-aac8-2dcaf30a9e2f tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Lock "e90877a8-47d3-47d7-8362-5bcfe3a98c36-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1730.500268] env[68617]: DEBUG oslo_concurrency.lockutils [None req-dd4e098a-0a0c-4e84-aac8-2dcaf30a9e2f tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Lock "e90877a8-47d3-47d7-8362-5bcfe3a98c36-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1730.504436] env[68617]: INFO nova.compute.manager [None req-dd4e098a-0a0c-4e84-aac8-2dcaf30a9e2f tempest-DeleteServersAdminTestJSON-1248100135 
tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Terminating instance [ 1730.506103] env[68617]: DEBUG nova.compute.manager [None req-dd4e098a-0a0c-4e84-aac8-2dcaf30a9e2f tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1730.506399] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-dd4e098a-0a0c-4e84-aac8-2dcaf30a9e2f tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1730.506846] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-eb7b0107-2b97-4eba-b5d0-58d5ac478721 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.516487] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03ce6d65-2858-4d25-b8d8-7c38e8a42b84 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.526604] env[68617]: DEBUG nova.compute.manager [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1730.548451] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-dd4e098a-0a0c-4e84-aac8-2dcaf30a9e2f tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e90877a8-47d3-47d7-8362-5bcfe3a98c36 could not be found. [ 1730.548702] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-dd4e098a-0a0c-4e84-aac8-2dcaf30a9e2f tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1730.548890] env[68617]: INFO nova.compute.manager [None req-dd4e098a-0a0c-4e84-aac8-2dcaf30a9e2f tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1730.549147] env[68617]: DEBUG oslo.service.loopingcall [None req-dd4e098a-0a0c-4e84-aac8-2dcaf30a9e2f tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1730.551603] env[68617]: DEBUG nova.compute.manager [-] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1730.551706] env[68617]: DEBUG nova.network.neutron [-] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1730.585011] env[68617]: DEBUG oslo_concurrency.lockutils [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1730.586928] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3c7c8f3-fad2-4f9a-9c1e-c8d113601156 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.597518] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4bfbf441-95c6-4766-a4ed-5460f2e716d1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.629526] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1f3d12f-c775-4633-b28a-9b2a4ecb169e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.636753] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef53ea2e-efcd-4883-9868-ecce89b419c8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.650083] env[68617]: DEBUG nova.compute.provider_tree [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1730.658547] env[68617]: DEBUG nova.scheduler.client.report [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1730.673833] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.357s {{(pid=68617) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1730.673833] env[68617]: DEBUG nova.compute.manager [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1730.676650] env[68617]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68617) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1730.676879] env[68617]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1730.677421] env[68617]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1730.677421] env[68617]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1730.677421] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1730.677421] env[68617]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1730.677421] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1730.677421] env[68617]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1730.677421] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1730.677421] env[68617]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1730.677421] env[68617]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1730.677421] env[68617]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-3b677fc1-e555-4e95-984b-8384625407bd'] [ 1730.677421] env[68617]: ERROR oslo.service.loopingcall [ 1730.677421] env[68617]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1730.677421] env[68617]: ERROR oslo.service.loopingcall [ 1730.677421] env[68617]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1730.677421] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1730.677421] env[68617]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1730.677921] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1730.677921] env[68617]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1730.677921] env[68617]: ERROR oslo.service.loopingcall File 
"/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1730.677921] env[68617]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1730.677921] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1730.677921] env[68617]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1730.677921] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1730.677921] env[68617]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1730.677921] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1730.677921] env[68617]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1730.677921] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1730.677921] env[68617]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1730.677921] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1730.677921] env[68617]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1730.677921] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1730.677921] env[68617]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1730.677921] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1730.677921] env[68617]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1730.678508] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1730.678508] env[68617]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1730.678508] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1730.678508] env[68617]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1730.678508] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1730.678508] env[68617]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1730.678508] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1730.678508] env[68617]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1730.678508] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1730.678508] env[68617]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1730.678508] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1730.678508] env[68617]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1730.678508] env[68617]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1730.678508] env[68617]: ERROR 
oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1730.678508] env[68617]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1730.678508] env[68617]: ERROR oslo.service.loopingcall [ 1730.678973] env[68617]: DEBUG oslo_concurrency.lockutils [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.093s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1730.679751] env[68617]: INFO nova.compute.claims [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1730.681974] env[68617]: ERROR nova.compute.manager [None req-dd4e098a-0a0c-4e84-aac8-2dcaf30a9e2f tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1730.727570] env[68617]: DEBUG nova.compute.utils [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1730.728791] env[68617]: DEBUG nova.compute.manager [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1730.728962] env[68617]: DEBUG nova.network.neutron [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1730.743276] env[68617]: ERROR nova.compute.manager [None req-dd4e098a-0a0c-4e84-aac8-2dcaf30a9e2f tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1730.743276] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Traceback (most recent call last): [ 1730.743276] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1730.743276] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] ret = obj(*args, **kwargs) [ 1730.743276] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1730.743276] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] exception_handler_v20(status_code, error_body) [ 1730.743276] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1730.743276] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] raise client_exc(message=error_message, [ 1730.743276] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1730.743276] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Neutron server returns request_ids: ['req-3b677fc1-e555-4e95-984b-8384625407bd'] [ 1730.743761] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] [ 1730.743761] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] During handling of the above exception, another exception occurred: [ 1730.743761] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] [ 1730.743761] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Traceback (most recent call last): [ 1730.743761] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1730.743761] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] self._delete_instance(context, instance, bdms) [ 1730.743761] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1730.743761] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] self._shutdown_instance(context, instance, bdms) [ 1730.743761] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1730.743761] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] self._try_deallocate_network(context, instance, requested_networks) [ 1730.743761] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1730.743761] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] with excutils.save_and_reraise_exception(): [ 1730.743761] env[68617]: ERROR 
nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1730.743761] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] self.force_reraise() [ 1730.744373] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1730.744373] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] raise self.value [ 1730.744373] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1730.744373] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] _deallocate_network_with_retries() [ 1730.744373] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1730.744373] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return evt.wait() [ 1730.744373] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1730.744373] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] result = hub.switch() [ 1730.744373] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1730.744373] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return self.greenlet.switch() [ 1730.744373] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1730.744373] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] result = func(*self.args, **self.kw) [ 1730.744792] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1730.744792] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] result = f(*args, **kwargs) [ 1730.744792] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1730.744792] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] self._deallocate_network( [ 1730.744792] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1730.744792] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] self.network_api.deallocate_for_instance( [ 1730.744792] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1730.744792] env[68617]: ERROR nova.compute.manager [instance: 
e90877a8-47d3-47d7-8362-5bcfe3a98c36] data = neutron.list_ports(**search_opts) [ 1730.744792] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1730.744792] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] ret = obj(*args, **kwargs) [ 1730.744792] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1730.744792] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return self.list('ports', self.ports_path, retrieve_all, [ 1730.744792] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1730.745215] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] ret = obj(*args, **kwargs) [ 1730.745215] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1730.745215] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] for r in self._pagination(collection, path, **params): [ 1730.745215] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1730.745215] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] res = self.get(path, params=params) [ 1730.745215] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1730.745215] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] ret = obj(*args, **kwargs) [ 1730.745215] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1730.745215] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return self.retry_request("GET", action, body=body, [ 1730.745215] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1730.745215] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] ret = obj(*args, **kwargs) [ 1730.745215] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1730.745215] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] return self.do_request(method, action, body=body, [ 1730.745583] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1730.745583] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] ret = obj(*args, **kwargs) [ 1730.745583] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1730.745583] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] self._handle_fault_response(status_code, replybody, resp) [ 1730.745583] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1730.745583] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1730.745583] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1730.745583] env[68617]: ERROR nova.compute.manager [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] [ 1730.754016] env[68617]: DEBUG nova.compute.manager [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1730.781985] env[68617]: DEBUG oslo_concurrency.lockutils [None req-dd4e098a-0a0c-4e84-aac8-2dcaf30a9e2f tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Lock "e90877a8-47d3-47d7-8362-5bcfe3a98c36" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.282s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1730.799663] env[68617]: DEBUG nova.policy [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c53a173b19c845b2b064ee99e02c892b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'edf8c85438704553aa2677189ea375f2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 1730.819816] env[68617]: DEBUG nova.compute.manager [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1730.844410] env[68617]: DEBUG nova.virt.hardware [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1730.844582] env[68617]: DEBUG nova.virt.hardware [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1730.844726] env[68617]: DEBUG nova.virt.hardware [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1730.844908] env[68617]: DEBUG nova.virt.hardware [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1730.845070] env[68617]: DEBUG nova.virt.hardware [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1730.845219] env[68617]: DEBUG nova.virt.hardware [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1730.845428] env[68617]: DEBUG nova.virt.hardware [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1730.845588] env[68617]: DEBUG nova.virt.hardware [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1730.845753] env[68617]: DEBUG 
nova.virt.hardware [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1730.845913] env[68617]: DEBUG nova.virt.hardware [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1730.846094] env[68617]: DEBUG nova.virt.hardware [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1730.846972] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a8b5b62-f593-4fd0-beb8-e033ee82c31f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.852851] env[68617]: INFO nova.compute.manager [None req-dd4e098a-0a0c-4e84-aac8-2dcaf30a9e2f tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] [instance: e90877a8-47d3-47d7-8362-5bcfe3a98c36] Successfully reverted task state from None on failure for instance. [ 1730.856776] env[68617]: ERROR oslo_messaging.rpc.server [None req-dd4e098a-0a0c-4e84-aac8-2dcaf30a9e2f tempest-DeleteServersAdminTestJSON-1248100135 tempest-DeleteServersAdminTestJSON-1248100135-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1730.856776] env[68617]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1730.856776] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1730.856776] env[68617]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1730.856776] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1730.856776] env[68617]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1730.856776] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1730.856776] env[68617]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1730.856776] env[68617]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1730.856776] env[68617]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-3b677fc1-e555-4e95-984b-8384625407bd'] [ 1730.856776] env[68617]: ERROR oslo_messaging.rpc.server [ 1730.856776] env[68617]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1730.856776] env[68617]: ERROR oslo_messaging.rpc.server [ 1730.856776] env[68617]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1730.856776] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1730.857268] env[68617]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1730.857268] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1730.857268] env[68617]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1730.857268] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1730.857268] env[68617]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1730.857268] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1730.857268] env[68617]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1730.857268] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1730.857268] env[68617]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1730.857268] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1730.857268] env[68617]: ERROR oslo_messaging.rpc.server raise self.value [ 1730.857268] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1730.857268] env[68617]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1730.857268] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1730.857268] env[68617]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 1730.857268] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1730.857268] env[68617]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1730.857268] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1730.857838] env[68617]: ERROR oslo_messaging.rpc.server raise self.value [ 1730.857838] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1730.857838] env[68617]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1730.857838] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 1730.857838] env[68617]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1730.857838] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1730.857838] env[68617]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1730.857838] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1730.857838] env[68617]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1730.857838] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1730.857838] env[68617]: ERROR oslo_messaging.rpc.server raise self.value [ 1730.857838] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1730.857838] env[68617]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1730.857838] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 1730.857838] env[68617]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1730.857838] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1730.857838] env[68617]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1730.857838] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 1730.858374] env[68617]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1730.858374] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1730.858374] env[68617]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1730.858374] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1730.858374] env[68617]: ERROR oslo_messaging.rpc.server raise self.value [ 1730.858374] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1730.858374] env[68617]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1730.858374] env[68617]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1730.858374] env[68617]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1730.858374] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1730.858374] env[68617]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1730.858374] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1730.858374] env[68617]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1730.858374] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1730.858374] env[68617]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1730.858374] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1730.858374] env[68617]: ERROR oslo_messaging.rpc.server raise self.value [ 1730.858374] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1730.858895] env[68617]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1730.858895] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1730.858895] env[68617]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1730.858895] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1730.858895] env[68617]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1730.858895] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1730.858895] env[68617]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1730.858895] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1730.858895] env[68617]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1730.858895] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1730.858895] env[68617]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1730.858895] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1730.858895] env[68617]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1730.858895] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1730.858895] env[68617]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1730.858895] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1730.858895] env[68617]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1730.858895] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1730.859363] env[68617]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1730.859363] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1730.859363] env[68617]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1730.859363] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1730.859363] env[68617]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1730.859363] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1730.859363] env[68617]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1730.859363] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1730.859363] env[68617]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1730.859363] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1730.859363] env[68617]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1730.859363] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1730.859363] env[68617]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1730.859363] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1730.859363] env[68617]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1730.859363] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1730.859363] env[68617]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1730.859363] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1730.859859] env[68617]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1730.859859] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1730.859859] env[68617]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1730.859859] env[68617]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1730.859859] env[68617]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1730.859859] env[68617]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1730.859859] env[68617]: ERROR oslo_messaging.rpc.server [ 1730.862367] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a459da90-4e99-4084-bd6a-d2e156cc3853 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.933314] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c20ef77-5e4e-4e76-980c-cf9cadf35d5a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.940388] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a01e8437-58a2-4515-9c61-a87cf2164762 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.970756] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-610ddd00-b44f-4cd2-9ccc-13a618319a7d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.977947] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eee7333f-ab9f-4e1c-a196-dc61d5346cce {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.990761] env[68617]: DEBUG nova.compute.provider_tree [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1730.999335] env[68617]: DEBUG nova.scheduler.client.report [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1731.013757] env[68617]: DEBUG oslo_concurrency.lockutils [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.336s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1731.014236] env[68617]: DEBUG nova.compute.manager [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Start building networks asynchronously for instance. 
{{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1731.047815] env[68617]: DEBUG nova.compute.utils [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1731.049238] env[68617]: DEBUG nova.compute.manager [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1731.049409] env[68617]: DEBUG nova.network.neutron [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1731.059387] env[68617]: DEBUG nova.compute.manager [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1731.123981] env[68617]: DEBUG nova.compute.manager [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1731.142535] env[68617]: DEBUG nova.policy [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4d10d70d150e4edfb265d8978e1012c9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '14d635d1e7ad4d00ab58a4678eb8d150', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 1731.147016] env[68617]: DEBUG nova.virt.hardware [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1731.147271] env[68617]: DEBUG nova.virt.hardware [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1731.147456] env[68617]: DEBUG nova.virt.hardware [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1731.148460] env[68617]: DEBUG nova.virt.hardware [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1731.148643] env[68617]: DEBUG nova.virt.hardware [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1731.148811] env[68617]: DEBUG nova.virt.hardware [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1731.149128] env[68617]: DEBUG nova.virt.hardware [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1731.149200] env[68617]: DEBUG nova.virt.hardware [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1731.149365] env[68617]: DEBUG nova.virt.hardware [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1731.149523] env[68617]: DEBUG nova.virt.hardware [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1731.149697] env[68617]: DEBUG nova.virt.hardware [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1731.152845] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2a129fa-8aec-496c-b13d-e45ff8a095c3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1731.160638] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e11391f-9061-4f14-a314-fe143168fc05 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1731.184259] env[68617]: DEBUG nova.network.neutron [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Successfully created port: a687866e-c38b-4030-95d5-95827823ff0a {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1731.622647] env[68617]: DEBUG nova.network.neutron [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Successfully created port: e803f1b3-4fc7-49a2-b659-e7d6147bff02 {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1731.698901] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1731.699115] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1731.699207] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1731.726419] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1731.726591] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1731.726723] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1731.726848] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1731.726971] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1731.727174] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1731.727384] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1731.727457] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1731.727542] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Skipping network cache update for instance because it is Building. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1731.727659] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1731.727777] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1732.193054] env[68617]: DEBUG nova.compute.manager [req-16a32892-de45-4ecf-996b-f48d67775e65 req-17bfbab5-bdcd-46c5-af86-36a2984111f3 service nova] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Received event network-vif-plugged-a687866e-c38b-4030-95d5-95827823ff0a {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1732.193364] env[68617]: DEBUG oslo_concurrency.lockutils [req-16a32892-de45-4ecf-996b-f48d67775e65 req-17bfbab5-bdcd-46c5-af86-36a2984111f3 service nova] Acquiring lock "21d0560a-fde3-4c16-b2fc-06d6f8668a7a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1732.193625] env[68617]: DEBUG oslo_concurrency.lockutils [req-16a32892-de45-4ecf-996b-f48d67775e65 req-17bfbab5-bdcd-46c5-af86-36a2984111f3 service nova] Lock "21d0560a-fde3-4c16-b2fc-06d6f8668a7a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1732.193766] env[68617]: DEBUG oslo_concurrency.lockutils [req-16a32892-de45-4ecf-996b-f48d67775e65 req-17bfbab5-bdcd-46c5-af86-36a2984111f3 service nova] Lock "21d0560a-fde3-4c16-b2fc-06d6f8668a7a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1732.193931] env[68617]: DEBUG nova.compute.manager [req-16a32892-de45-4ecf-996b-f48d67775e65 req-17bfbab5-bdcd-46c5-af86-36a2984111f3 service nova] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] No waiting events found dispatching network-vif-plugged-a687866e-c38b-4030-95d5-95827823ff0a {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1732.194135] env[68617]: WARNING nova.compute.manager [req-16a32892-de45-4ecf-996b-f48d67775e65 req-17bfbab5-bdcd-46c5-af86-36a2984111f3 service nova] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Received unexpected event network-vif-plugged-a687866e-c38b-4030-95d5-95827823ff0a for instance with vm_state building and task_state spawning. 
[ 1732.290641] env[68617]: DEBUG nova.network.neutron [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Successfully updated port: a687866e-c38b-4030-95d5-95827823ff0a {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1732.304851] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Acquiring lock "refresh_cache-21d0560a-fde3-4c16-b2fc-06d6f8668a7a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1732.304851] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Acquired lock "refresh_cache-21d0560a-fde3-4c16-b2fc-06d6f8668a7a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1732.304989] env[68617]: DEBUG nova.network.neutron [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1732.354622] env[68617]: DEBUG nova.network.neutron [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Instance cache missing network info. 
{{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1732.657467] env[68617]: DEBUG nova.network.neutron [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Updating instance_info_cache with network_info: [{"id": "a687866e-c38b-4030-95d5-95827823ff0a", "address": "fa:16:3e:53:f5:ca", "network": {"id": "7b552a3e-6a7c-4a2f-92ae-0398b244b248", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1354374474-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "edf8c85438704553aa2677189ea375f2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ed4797-90ad-44cd-bbcb-e90b2a8400f3", "external-id": "nsx-vlan-transportzone-699", "segmentation_id": 699, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa687866e-c3", "ovs_interfaceid": "a687866e-c38b-4030-95d5-95827823ff0a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1732.669813] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Releasing lock "refresh_cache-21d0560a-fde3-4c16-b2fc-06d6f8668a7a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1732.670193] env[68617]: DEBUG nova.compute.manager [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Instance network_info: |[{"id": "a687866e-c38b-4030-95d5-95827823ff0a", "address": "fa:16:3e:53:f5:ca", "network": {"id": "7b552a3e-6a7c-4a2f-92ae-0398b244b248", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1354374474-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "edf8c85438704553aa2677189ea375f2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ed4797-90ad-44cd-bbcb-e90b2a8400f3", "external-id": "nsx-vlan-transportzone-699", "segmentation_id": 699, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa687866e-c3", "ovs_interfaceid": "a687866e-c38b-4030-95d5-95827823ff0a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1732.670619] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:53:f5:ca', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '89ed4797-90ad-44cd-bbcb-e90b2a8400f3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a687866e-c38b-4030-95d5-95827823ff0a', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1732.678456] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Creating folder: Project (edf8c85438704553aa2677189ea375f2). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1732.679031] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-cc41af45-993b-4695-bc1b-bad3383716ae {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1732.689821] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Created folder: Project (edf8c85438704553aa2677189ea375f2) in parent group-v693691. [ 1732.690037] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Creating folder: Instances. Parent ref: group-v693784. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1732.690287] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6c0087cc-b6a8-44d1-a70a-1e1045170839 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1732.697662] env[68617]: DEBUG nova.network.neutron [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Successfully updated port: e803f1b3-4fc7-49a2-b659-e7d6147bff02 {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1732.699861] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Created folder: Instances in parent group-v693784. [ 1732.700267] env[68617]: DEBUG oslo.service.loopingcall [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1732.700714] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1732.700935] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a38c74d2-48b7-4974-b8ef-4158901cac1a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1732.716760] env[68617]: DEBUG oslo_concurrency.lockutils [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Acquiring lock "refresh_cache-902b5ab9-23b8-450f-853a-b2da889c3afd" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1732.717157] env[68617]: DEBUG oslo_concurrency.lockutils [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Acquired lock "refresh_cache-902b5ab9-23b8-450f-853a-b2da889c3afd" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1732.717157] env[68617]: DEBUG nova.network.neutron [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1732.723259] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1732.723259] env[68617]: value = "task-3470861" [ 1732.723259] env[68617]: _type = "Task" [ 1732.723259] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1732.737844] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470861, 'name': CreateVM_Task} progress is 5%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1732.760616] env[68617]: DEBUG nova.network.neutron [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Instance cache missing network info. 
{{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1732.938249] env[68617]: DEBUG nova.network.neutron [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Updating instance_info_cache with network_info: [{"id": "e803f1b3-4fc7-49a2-b659-e7d6147bff02", "address": "fa:16:3e:7b:3b:db", "network": {"id": "04212500-0e8c-4ed8-bf43-34e68c0b98fe", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-628999513-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "14d635d1e7ad4d00ab58a4678eb8d150", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bed837fa-6b6a-4192-a229-a99426a46065", "external-id": "nsx-vlan-transportzone-954", "segmentation_id": 954, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape803f1b3-4f", "ovs_interfaceid": "e803f1b3-4fc7-49a2-b659-e7d6147bff02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1732.953589] env[68617]: DEBUG oslo_concurrency.lockutils [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Releasing lock "refresh_cache-902b5ab9-23b8-450f-853a-b2da889c3afd" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1732.954043] env[68617]: DEBUG nova.compute.manager [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Instance network_info: |[{"id": "e803f1b3-4fc7-49a2-b659-e7d6147bff02", "address": "fa:16:3e:7b:3b:db", "network": {"id": "04212500-0e8c-4ed8-bf43-34e68c0b98fe", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-628999513-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "14d635d1e7ad4d00ab58a4678eb8d150", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bed837fa-6b6a-4192-a229-a99426a46065", "external-id": "nsx-vlan-transportzone-954", "segmentation_id": 954, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape803f1b3-4f", "ovs_interfaceid": "e803f1b3-4fc7-49a2-b659-e7d6147bff02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1732.954823] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7b:3b:db', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'bed837fa-6b6a-4192-a229-a99426a46065', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e803f1b3-4fc7-49a2-b659-e7d6147bff02', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1732.963127] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Creating folder: Project (14d635d1e7ad4d00ab58a4678eb8d150). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1732.963958] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-82876296-9e24-42fb-839c-6744ea4c0092 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1732.977016] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Created folder: Project (14d635d1e7ad4d00ab58a4678eb8d150) in parent group-v693691. [ 1732.977240] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Creating folder: Instances. Parent ref: group-v693787. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1732.977500] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fe2df7ae-1164-49f5-a1ea-fe3a75fa15a9 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1732.986771] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Created folder: Instances in parent group-v693787. [ 1732.987026] env[68617]: DEBUG oslo.service.loopingcall [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1732.987231] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1732.987481] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-10caedb8-3b40-49c1-90c5-fbbdf5cb9f09 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1733.007020] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1733.007020] env[68617]: value = "task-3470864" [ 1733.007020] env[68617]: _type = "Task" [ 1733.007020] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1733.016925] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470864, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1733.234977] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470861, 'name': CreateVM_Task, 'duration_secs': 0.347048} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1733.235345] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1733.235799] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1733.235969] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1733.236341] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1733.236599] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4dd36764-645d-4600-ade3-a913397c6292 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1733.241096] env[68617]: DEBUG oslo_vmware.api [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Waiting for the task: (returnval){ [ 1733.241096] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52a828e3-b895-8bd9-ac79-e9f252555305" [ 1733.241096] env[68617]: _type = "Task" [ 1733.241096] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1733.248801] env[68617]: DEBUG oslo_vmware.api [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52a828e3-b895-8bd9-ac79-e9f252555305, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1733.516882] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470864, 'name': CreateVM_Task, 'duration_secs': 0.352967} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1733.517064] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1733.517689] env[68617]: DEBUG oslo_concurrency.lockutils [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1733.753506] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1733.753725] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1733.753949] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1733.754192] env[68617]: DEBUG oslo_concurrency.lockutils [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1733.754521] env[68617]: DEBUG oslo_concurrency.lockutils [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 
1733.754788] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-24bf87de-bc54-4593-8e18-f5504fe1ad75 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1733.760716] env[68617]: DEBUG oslo_vmware.api [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Waiting for the task: (returnval){ [ 1733.760716] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52fe356f-0b14-db46-8700-d98e2e16cca5" [ 1733.760716] env[68617]: _type = "Task" [ 1733.760716] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1733.769893] env[68617]: DEBUG oslo_vmware.api [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52fe356f-0b14-db46-8700-d98e2e16cca5, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1734.225141] env[68617]: DEBUG nova.compute.manager [req-d2826553-2041-4a0b-88fb-a91e468b4307 req-d254cb0d-1a1a-47b4-9dca-6352889d004a service nova] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Received event network-changed-a687866e-c38b-4030-95d5-95827823ff0a {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1734.225325] env[68617]: DEBUG nova.compute.manager [req-d2826553-2041-4a0b-88fb-a91e468b4307 req-d254cb0d-1a1a-47b4-9dca-6352889d004a service nova] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Refreshing instance network info cache due to event network-changed-a687866e-c38b-4030-95d5-95827823ff0a. 
{{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1734.225504] env[68617]: DEBUG oslo_concurrency.lockutils [req-d2826553-2041-4a0b-88fb-a91e468b4307 req-d254cb0d-1a1a-47b4-9dca-6352889d004a service nova] Acquiring lock "refresh_cache-21d0560a-fde3-4c16-b2fc-06d6f8668a7a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1734.225622] env[68617]: DEBUG oslo_concurrency.lockutils [req-d2826553-2041-4a0b-88fb-a91e468b4307 req-d254cb0d-1a1a-47b4-9dca-6352889d004a service nova] Acquired lock "refresh_cache-21d0560a-fde3-4c16-b2fc-06d6f8668a7a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1734.225914] env[68617]: DEBUG nova.network.neutron [req-d2826553-2041-4a0b-88fb-a91e468b4307 req-d254cb0d-1a1a-47b4-9dca-6352889d004a service nova] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Refreshing network info cache for port a687866e-c38b-4030-95d5-95827823ff0a {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1734.274464] env[68617]: DEBUG oslo_concurrency.lockutils [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1734.274778] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1734.274993] env[68617]: DEBUG oslo_concurrency.lockutils [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1734.499326] env[68617]: DEBUG nova.network.neutron [req-d2826553-2041-4a0b-88fb-a91e468b4307 req-d254cb0d-1a1a-47b4-9dca-6352889d004a service nova] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Updated VIF entry in instance network info cache for port a687866e-c38b-4030-95d5-95827823ff0a. 
{{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1734.499760] env[68617]: DEBUG nova.network.neutron [req-d2826553-2041-4a0b-88fb-a91e468b4307 req-d254cb0d-1a1a-47b4-9dca-6352889d004a service nova] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Updating instance_info_cache with network_info: [{"id": "a687866e-c38b-4030-95d5-95827823ff0a", "address": "fa:16:3e:53:f5:ca", "network": {"id": "7b552a3e-6a7c-4a2f-92ae-0398b244b248", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1354374474-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "edf8c85438704553aa2677189ea375f2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "89ed4797-90ad-44cd-bbcb-e90b2a8400f3", "external-id": "nsx-vlan-transportzone-699", "segmentation_id": 699, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa687866e-c3", "ovs_interfaceid": "a687866e-c38b-4030-95d5-95827823ff0a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1734.510833] env[68617]: DEBUG oslo_concurrency.lockutils [req-d2826553-2041-4a0b-88fb-a91e468b4307 req-d254cb0d-1a1a-47b4-9dca-6352889d004a service nova] Releasing lock "refresh_cache-21d0560a-fde3-4c16-b2fc-06d6f8668a7a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1734.511131] env[68617]: DEBUG nova.compute.manager [req-d2826553-2041-4a0b-88fb-a91e468b4307 req-d254cb0d-1a1a-47b4-9dca-6352889d004a service nova] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Received event network-vif-plugged-e803f1b3-4fc7-49a2-b659-e7d6147bff02 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1734.511330] env[68617]: DEBUG oslo_concurrency.lockutils [req-d2826553-2041-4a0b-88fb-a91e468b4307 req-d254cb0d-1a1a-47b4-9dca-6352889d004a service nova] Acquiring lock "902b5ab9-23b8-450f-853a-b2da889c3afd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1734.511533] env[68617]: DEBUG oslo_concurrency.lockutils [req-d2826553-2041-4a0b-88fb-a91e468b4307 req-d254cb0d-1a1a-47b4-9dca-6352889d004a service nova] Lock "902b5ab9-23b8-450f-853a-b2da889c3afd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1734.511691] env[68617]: DEBUG oslo_concurrency.lockutils [req-d2826553-2041-4a0b-88fb-a91e468b4307 req-d254cb0d-1a1a-47b4-9dca-6352889d004a service nova] Lock "902b5ab9-23b8-450f-853a-b2da889c3afd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1734.511855] env[68617]: DEBUG 
nova.compute.manager [req-d2826553-2041-4a0b-88fb-a91e468b4307 req-d254cb0d-1a1a-47b4-9dca-6352889d004a service nova] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] No waiting events found dispatching network-vif-plugged-e803f1b3-4fc7-49a2-b659-e7d6147bff02 {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1734.512024] env[68617]: WARNING nova.compute.manager [req-d2826553-2041-4a0b-88fb-a91e468b4307 req-d254cb0d-1a1a-47b4-9dca-6352889d004a service nova] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Received unexpected event network-vif-plugged-e803f1b3-4fc7-49a2-b659-e7d6147bff02 for instance with vm_state building and task_state spawning. [ 1734.512190] env[68617]: DEBUG nova.compute.manager [req-d2826553-2041-4a0b-88fb-a91e468b4307 req-d254cb0d-1a1a-47b4-9dca-6352889d004a service nova] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Received event network-changed-e803f1b3-4fc7-49a2-b659-e7d6147bff02 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1734.512358] env[68617]: DEBUG nova.compute.manager [req-d2826553-2041-4a0b-88fb-a91e468b4307 req-d254cb0d-1a1a-47b4-9dca-6352889d004a service nova] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Refreshing instance network info cache due to event network-changed-e803f1b3-4fc7-49a2-b659-e7d6147bff02. {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1734.512517] env[68617]: DEBUG oslo_concurrency.lockutils [req-d2826553-2041-4a0b-88fb-a91e468b4307 req-d254cb0d-1a1a-47b4-9dca-6352889d004a service nova] Acquiring lock "refresh_cache-902b5ab9-23b8-450f-853a-b2da889c3afd" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1734.512668] env[68617]: DEBUG oslo_concurrency.lockutils [req-d2826553-2041-4a0b-88fb-a91e468b4307 req-d254cb0d-1a1a-47b4-9dca-6352889d004a service nova] Acquired lock "refresh_cache-902b5ab9-23b8-450f-853a-b2da889c3afd" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1734.512804] env[68617]: DEBUG nova.network.neutron [req-d2826553-2041-4a0b-88fb-a91e468b4307 req-d254cb0d-1a1a-47b4-9dca-6352889d004a service nova] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Refreshing network info cache for port e803f1b3-4fc7-49a2-b659-e7d6147bff02 {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1734.699147] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1734.729349] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1734.766029] env[68617]: DEBUG nova.network.neutron [req-d2826553-2041-4a0b-88fb-a91e468b4307 req-d254cb0d-1a1a-47b4-9dca-6352889d004a service nova] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Updated VIF entry in instance network info cache for port e803f1b3-4fc7-49a2-b659-e7d6147bff02. 
{{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1734.766029] env[68617]: DEBUG nova.network.neutron [req-d2826553-2041-4a0b-88fb-a91e468b4307 req-d254cb0d-1a1a-47b4-9dca-6352889d004a service nova] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Updating instance_info_cache with network_info: [{"id": "e803f1b3-4fc7-49a2-b659-e7d6147bff02", "address": "fa:16:3e:7b:3b:db", "network": {"id": "04212500-0e8c-4ed8-bf43-34e68c0b98fe", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-628999513-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "14d635d1e7ad4d00ab58a4678eb8d150", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bed837fa-6b6a-4192-a229-a99426a46065", "external-id": "nsx-vlan-transportzone-954", "segmentation_id": 954, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape803f1b3-4f", "ovs_interfaceid": "e803f1b3-4fc7-49a2-b659-e7d6147bff02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1734.774354] env[68617]: DEBUG oslo_concurrency.lockutils [req-d2826553-2041-4a0b-88fb-a91e468b4307 req-d254cb0d-1a1a-47b4-9dca-6352889d004a service nova] Releasing lock "refresh_cache-902b5ab9-23b8-450f-853a-b2da889c3afd" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1735.387227] env[68617]: DEBUG oslo_concurrency.lockutils [None req-55c4f4bc-0d05-4fce-9b83-cc0321f57d66 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "f54002b0-d60e-44ff-82a5-ef2f5193c48c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1735.698325] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1735.698576] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1735.698733] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1736.699342] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 
1736.699613] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1737.999185] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1f230113-be24-433b-be89-74b3d5b0776e tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Acquiring lock "2c950cba-7698-48e0-8852-bf569f58f967" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1739.699239] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1739.710644] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1739.710860] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1739.711042] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1739.711205] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1739.712299] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7348f4d6-75ff-4874-bf4b-63720b9270c0 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1739.722100] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40ac7176-4238-4773-b904-8c760fdbca36 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1739.735504] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fcc70915-b2a2-4d1e-9504-c8e0d34f15fc {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1739.741538] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5db97ff6-bde3-4ae6-8dc9-acd4e6d559ca {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1739.769258] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 
None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180906MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1739.769437] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1739.769589] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1739.844665] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f03b9bc5-9438-4c0c-b595-72c631bece08 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1739.844827] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance ee6efd93-25be-4268-afe9-ba39e543a4fb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1739.844956] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 1605028f-5d6d-4ac4-8416-c0465982c53a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1739.845097] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance fc1043b8-535d-4af0-b92b-1f43580cdc9a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1739.845221] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1739.845340] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f54002b0-d60e-44ff-82a5-ef2f5193c48c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1739.845455] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 2c950cba-7698-48e0-8852-bf569f58f967 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1739.845571] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 12ed2a40-3d74-49a2-95b4-ccaaf58c8060 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1739.845684] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 21d0560a-fde3-4c16-b2fc-06d6f8668a7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1739.845797] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 902b5ab9-23b8-450f-853a-b2da889c3afd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1739.857263] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 922c8926-c636-4463-85d6-4f2a6325b85a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1739.869044] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b1a8dc60-af98-4f80-96cf-b2550ea8c13a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1739.879094] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a4ab788d-327a-47cc-8ae7-e1b9be889759 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1739.889266] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance fe0d64a6-6ce6-4ef5-8ae1-a160c5ec0987 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1739.889510] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1739.889617] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1740.046121] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05f9bb84-fe74-499b-aabd-46d063cb5253 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1740.053572] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76150063-086d-49b0-a028-5be722e7b22d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1740.085017] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0fdfdd0-71d6-48a2-a2a6-1107de9c52be {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1740.092047] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0fd5112-4172-4676-b462-ef579951cf33 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1740.105116] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1740.113427] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1740.126641] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1740.127106] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.357s {{(pid=68617) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1741.127225] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1741.694200] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1745.225675] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2c94f0c7-c213-4bc7-8690-5eb28181b87f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquiring lock "12ed2a40-3d74-49a2-95b4-ccaaf58c8060" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1754.363159] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c5e2dd77-afa9-470f-9b6b-7c9598bbad7d tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Acquiring lock "21d0560a-fde3-4c16-b2fc-06d6f8668a7a" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1758.611518] env[68617]: DEBUG oslo_concurrency.lockutils [None req-054eb420-90e6-46b5-993a-5914b3863296 tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Acquiring lock "902b5ab9-23b8-450f-853a-b2da889c3afd" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1779.995915] env[68617]: WARNING oslo_vmware.rw_handles [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1779.995915] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1779.995915] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1779.995915] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1779.995915] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1779.995915] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 1779.995915] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1779.995915] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1779.995915] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1779.995915] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1779.995915] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1779.995915] env[68617]: ERROR 
oslo_vmware.rw_handles [ 1779.996679] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/93d19a37-bb3a-4c4b-a665-2fb3c480306f/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1779.998553] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1779.998826] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Copying Virtual Disk [datastore2] vmware_temp/93d19a37-bb3a-4c4b-a665-2fb3c480306f/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/93d19a37-bb3a-4c4b-a665-2fb3c480306f/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1779.999155] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ead80aaa-f0fe-4e12-b0a1-3e7b7268ca9c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1780.007958] env[68617]: DEBUG oslo_vmware.api [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Waiting for the task: (returnval){ [ 1780.007958] env[68617]: value = "task-3470865" [ 1780.007958] env[68617]: _type = "Task" [ 1780.007958] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1780.016513] env[68617]: DEBUG oslo_vmware.api [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Task: {'id': task-3470865, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1780.519075] env[68617]: DEBUG oslo_vmware.exceptions [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Fault InvalidArgument not matched. 
{{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1780.519075] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1780.519610] env[68617]: ERROR nova.compute.manager [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1780.519610] env[68617]: Faults: ['InvalidArgument'] [ 1780.519610] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Traceback (most recent call last): [ 1780.519610] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1780.519610] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] yield resources [ 1780.519610] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1780.519610] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] self.driver.spawn(context, instance, image_meta, [ 1780.519610] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1780.519610] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1780.519610] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1780.519610] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] self._fetch_image_if_missing(context, vi) [ 1780.519610] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1780.520031] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] image_cache(vi, tmp_image_ds_loc) [ 1780.520031] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1780.520031] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] vm_util.copy_virtual_disk( [ 1780.520031] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1780.520031] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] session._wait_for_task(vmdk_copy_task) [ 1780.520031] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1780.520031] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] return self.wait_for_task(task_ref) [ 1780.520031] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1780.520031] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] return evt.wait() [ 1780.520031] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1780.520031] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] result = hub.switch() [ 1780.520031] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1780.520031] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] return self.greenlet.switch() [ 1780.520521] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1780.520521] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] self.f(*self.args, **self.kw) [ 1780.520521] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1780.520521] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] raise exceptions.translate_fault(task_info.error) [ 1780.520521] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1780.520521] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Faults: ['InvalidArgument'] [ 1780.520521] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] [ 1780.520521] env[68617]: INFO nova.compute.manager [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Terminating instance [ 1780.521527] env[68617]: DEBUG oslo_concurrency.lockutils [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1780.522565] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1780.522565] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-04d728fd-9c75-4123-b705-33ee9a54ef25 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1780.524049] env[68617]: DEBUG nova.compute.manager [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1780.524253] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1780.525064] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-841823e6-83da-4f5e-81a0-7b3974cc6343 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1780.532029] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1780.532270] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-bd206e9f-0164-48ad-b614-8b64f962c060 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1780.534525] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1780.534712] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1780.536167] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4fd04cb0-f8ba-434b-b1df-962a62d6bc2b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1780.540690] env[68617]: DEBUG oslo_vmware.api [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Waiting for the task: (returnval){ [ 1780.540690] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]528fb87b-4727-efa8-7221-1a6ee6b3bedd" [ 1780.540690] env[68617]: _type = "Task" [ 1780.540690] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1780.548080] env[68617]: DEBUG oslo_vmware.api [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]528fb87b-4727-efa8-7221-1a6ee6b3bedd, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1780.599637] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1780.599857] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1780.600045] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Deleting the datastore file [datastore2] f03b9bc5-9438-4c0c-b595-72c631bece08 {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1780.600314] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ec184f95-272f-4b34-ac8b-442b3b282fa2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1780.607033] env[68617]: DEBUG oslo_vmware.api [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Waiting for the task: (returnval){ [ 1780.607033] env[68617]: value = "task-3470867" [ 1780.607033] env[68617]: _type = "Task" [ 1780.607033] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1780.614602] env[68617]: DEBUG oslo_vmware.api [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Task: {'id': task-3470867, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1781.051419] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1781.051761] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Creating directory with path [datastore2] vmware_temp/64e6bb17-328c-4f6f-a98f-eeeb189b7a90/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1781.051853] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-579b67d5-0b5b-42ff-95b9-171680b5caa2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1781.062763] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Created directory with path [datastore2] vmware_temp/64e6bb17-328c-4f6f-a98f-eeeb189b7a90/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1781.062941] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Fetch image to [datastore2] vmware_temp/64e6bb17-328c-4f6f-a98f-eeeb189b7a90/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1781.063121] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/64e6bb17-328c-4f6f-a98f-eeeb189b7a90/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1781.063794] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7afe97ce-c695-4966-9e32-68a6d4aa9844 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1781.070021] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2f5a1a1-3393-4348-9654-bef86ed0bd76 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1781.078560] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1548306-e9b4-4077-8ced-e04c120c52c5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1781.112055] env[68617]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3cdd0d4-7d92-4123-9576-3da4bb5ceaf5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1781.118656] env[68617]: DEBUG oslo_vmware.api [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Task: {'id': task-3470867, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079608} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1781.120096] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1781.120280] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1781.120450] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1781.120619] env[68617]: INFO nova.compute.manager [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1781.122311] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2afa8c0a-1ac8-43eb-8ebb-fd547758d635 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1781.124090] env[68617]: DEBUG nova.compute.claims [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1781.124265] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1781.124477] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1781.144557] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1781.195715] env[68617]: DEBUG oslo_vmware.rw_handles [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/64e6bb17-328c-4f6f-a98f-eeeb189b7a90/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1781.254293] env[68617]: DEBUG oslo_vmware.rw_handles [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1781.254395] env[68617]: DEBUG oslo_vmware.rw_handles [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/64e6bb17-328c-4f6f-a98f-eeeb189b7a90/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1781.381786] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3406af9-fa54-46cd-9531-bb83b785d731 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1781.389711] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e944fce6-2dca-4ce0-a92b-109a3f564e38 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1781.418271] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48cd4b2b-9985-4df7-9cb4-92ec5c666086 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1781.424721] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b092d528-94ef-47d2-813b-ac622267c93b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1781.437941] env[68617]: DEBUG nova.compute.provider_tree [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1781.446900] env[68617]: DEBUG nova.scheduler.client.report [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1781.460215] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.336s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1781.460738] env[68617]: ERROR nova.compute.manager [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1781.460738] env[68617]: Faults: ['InvalidArgument'] [ 1781.460738] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Traceback (most recent call last): [ 1781.460738] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance 
[ 1781.460738] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] self.driver.spawn(context, instance, image_meta, [ 1781.460738] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1781.460738] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1781.460738] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1781.460738] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] self._fetch_image_if_missing(context, vi) [ 1781.460738] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1781.460738] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] image_cache(vi, tmp_image_ds_loc) [ 1781.460738] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1781.461127] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] vm_util.copy_virtual_disk( [ 1781.461127] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1781.461127] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] session._wait_for_task(vmdk_copy_task) [ 1781.461127] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1781.461127] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] return self.wait_for_task(task_ref) [ 1781.461127] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1781.461127] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] return evt.wait() [ 1781.461127] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1781.461127] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] result = hub.switch() [ 1781.461127] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1781.461127] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] return self.greenlet.switch() [ 1781.461127] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1781.461127] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] self.f(*self.args, **self.kw) [ 1781.461529] env[68617]: ERROR nova.compute.manager [instance: 
f03b9bc5-9438-4c0c-b595-72c631bece08] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1781.461529] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] raise exceptions.translate_fault(task_info.error) [ 1781.461529] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1781.461529] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Faults: ['InvalidArgument'] [ 1781.461529] env[68617]: ERROR nova.compute.manager [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] [ 1781.461529] env[68617]: DEBUG nova.compute.utils [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1781.462730] env[68617]: DEBUG nova.compute.manager [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Build of instance f03b9bc5-9438-4c0c-b595-72c631bece08 was re-scheduled: A specified parameter was not correct: fileType [ 1781.462730] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1781.463117] env[68617]: DEBUG nova.compute.manager [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1781.463291] env[68617]: DEBUG nova.compute.manager [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1781.463457] env[68617]: DEBUG nova.compute.manager [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1781.463621] env[68617]: DEBUG nova.network.neutron [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1781.770923] env[68617]: DEBUG nova.network.neutron [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1781.785335] env[68617]: INFO nova.compute.manager [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Took 0.32 seconds to deallocate network for instance. [ 1781.887349] env[68617]: INFO nova.scheduler.client.report [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Deleted allocations for instance f03b9bc5-9438-4c0c-b595-72c631bece08 [ 1781.909764] env[68617]: DEBUG oslo_concurrency.lockutils [None req-3d936f67-b748-40b2-bb92-3fa502cae701 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Lock "f03b9bc5-9438-4c0c-b595-72c631bece08" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 585.809s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1781.911099] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0f4f2eab-6e03-451a-b883-309c60868205 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Lock "f03b9bc5-9438-4c0c-b595-72c631bece08" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 389.639s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1781.911327] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0f4f2eab-6e03-451a-b883-309c60868205 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Acquiring lock "f03b9bc5-9438-4c0c-b595-72c631bece08-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1781.911534] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0f4f2eab-6e03-451a-b883-309c60868205 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Lock "f03b9bc5-9438-4c0c-b595-72c631bece08-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1781.911696] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0f4f2eab-6e03-451a-b883-309c60868205 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Lock "f03b9bc5-9438-4c0c-b595-72c631bece08-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1781.913662] env[68617]: INFO nova.compute.manager [None req-0f4f2eab-6e03-451a-b883-309c60868205 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Terminating instance [ 1781.915554] env[68617]: DEBUG nova.compute.manager [None req-0f4f2eab-6e03-451a-b883-309c60868205 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1781.915653] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0f4f2eab-6e03-451a-b883-309c60868205 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1781.916160] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2b6ee075-302d-47c5-a7d1-ea02ef698e9d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1781.925769] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-425b727d-079b-4e60-9659-21f2f6322f52 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1781.937211] env[68617]: DEBUG nova.compute.manager [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1781.958069] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-0f4f2eab-6e03-451a-b883-309c60868205 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f03b9bc5-9438-4c0c-b595-72c631bece08 could not be found. 
[ 1781.958292] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0f4f2eab-6e03-451a-b883-309c60868205 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1781.959027] env[68617]: INFO nova.compute.manager [None req-0f4f2eab-6e03-451a-b883-309c60868205 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1781.959027] env[68617]: DEBUG oslo.service.loopingcall [None req-0f4f2eab-6e03-451a-b883-309c60868205 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1781.959027] env[68617]: DEBUG nova.compute.manager [-] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1781.959027] env[68617]: DEBUG nova.network.neutron [-] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1781.982494] env[68617]: DEBUG nova.network.neutron [-] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1781.989748] env[68617]: INFO nova.compute.manager [-] [instance: f03b9bc5-9438-4c0c-b595-72c631bece08] Took 0.03 seconds to deallocate network for instance.
[ 1781.992151] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1781.992387] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1781.993858] env[68617]: INFO nova.compute.claims [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1782.079133] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0f4f2eab-6e03-451a-b883-309c60868205 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Lock "f03b9bc5-9438-4c0c-b595-72c631bece08" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.168s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1782.193823] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ca226b6-6ae2-4732-b173-985a16c3b089 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1782.201636] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-045c8269-763e-4756-a1e2-be7e2ffc454f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1782.230897] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79286ad7-2735-4019-b654-be39ad21a516 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1782.237488] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34ba7dc7-5ffe-4132-8b19-12338a691456 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1782.249978] env[68617]: DEBUG nova.compute.provider_tree [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1782.258358] env[68617]: DEBUG nova.scheduler.client.report [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1,
'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1782.271281] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.279s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1782.271716] env[68617]: DEBUG nova.compute.manager [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1782.304613] env[68617]: DEBUG nova.compute.utils [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1782.306039] env[68617]: DEBUG nova.compute.manager [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1782.306141] env[68617]: DEBUG nova.network.neutron [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1782.313884] env[68617]: DEBUG nova.compute.manager [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Start building block device mappings for instance. 
{{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1782.365233] env[68617]: DEBUG nova.policy [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3e3e6fa7da72463faa4f9568ff97776a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c07119c006e84a66bf7a37c1920f3694', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 1782.379707] env[68617]: DEBUG nova.compute.manager [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Start spawning the instance on the hypervisor. {{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1782.404121] env[68617]: DEBUG nova.virt.hardware [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1782.404381] env[68617]: DEBUG nova.virt.hardware [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1782.404537] env[68617]: DEBUG nova.virt.hardware [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1782.404772] env[68617]: DEBUG nova.virt.hardware [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1782.404890] env[68617]: DEBUG nova.virt.hardware [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 1782.406040] env[68617]: DEBUG nova.virt.hardware [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1782.406040] env[68617]: DEBUG nova.virt.hardware [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1782.406040] env[68617]: DEBUG nova.virt.hardware [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1782.406040] env[68617]: DEBUG nova.virt.hardware [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1782.406040] env[68617]: DEBUG nova.virt.hardware [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1782.406442] env[68617]: DEBUG nova.virt.hardware [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1782.407011] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd4f88c7-bf00-41f0-8007-b4c65b5c96c7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1782.417207] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a079b743-ffa6-407d-8928-b92fffe70477 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1782.881522] env[68617]: DEBUG nova.network.neutron [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Successfully created port: 31162b00-a40f-44e3-a949-7b0cf92da041 {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1783.434885] env[68617]: DEBUG nova.network.neutron [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Successfully updated port: 31162b00-a40f-44e3-a949-7b0cf92da041 {{(pid=68617) _update_port 
/opt/stack/nova/nova/network/neutron.py:586}} [ 1783.444545] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Acquiring lock "refresh_cache-922c8926-c636-4463-85d6-4f2a6325b85a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1783.444714] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Acquired lock "refresh_cache-922c8926-c636-4463-85d6-4f2a6325b85a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1783.444880] env[68617]: DEBUG nova.network.neutron [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1783.487430] env[68617]: DEBUG nova.network.neutron [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1783.641425] env[68617]: DEBUG nova.network.neutron [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Updating instance_info_cache with network_info: [{"id": "31162b00-a40f-44e3-a949-7b0cf92da041", "address": "fa:16:3e:9f:66:82", "network": {"id": "65bec07e-2fec-40ce-ac24-d75d61493fed", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-174756301-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c07119c006e84a66bf7a37c1920f3694", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "79ece966-6187-47d7-bce7-cc39df14ac67", "external-id": "nsx-vlan-transportzone-472", "segmentation_id": 472, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap31162b00-a4", "ovs_interfaceid": "31162b00-a40f-44e3-a949-7b0cf92da041", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1783.654239] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Releasing lock "refresh_cache-922c8926-c636-4463-85d6-4f2a6325b85a" {{(pid=68617) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1783.654551] env[68617]: DEBUG nova.compute.manager [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Instance network_info: |[{"id": "31162b00-a40f-44e3-a949-7b0cf92da041", "address": "fa:16:3e:9f:66:82", "network": {"id": "65bec07e-2fec-40ce-ac24-d75d61493fed", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-174756301-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c07119c006e84a66bf7a37c1920f3694", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "79ece966-6187-47d7-bce7-cc39df14ac67", "external-id": "nsx-vlan-transportzone-472", "segmentation_id": 472, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap31162b00-a4", "ovs_interfaceid": "31162b00-a40f-44e3-a949-7b0cf92da041", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1783.654959] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9f:66:82', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '79ece966-6187-47d7-bce7-cc39df14ac67', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '31162b00-a40f-44e3-a949-7b0cf92da041', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1783.662599] env[68617]: DEBUG oslo.service.loopingcall [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1783.663127] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1783.663361] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-beb0e795-3573-45c2-9cbc-269c83cd8353 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1783.684350] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1783.684350] env[68617]: value = "task-3470868" [ 1783.684350] env[68617]: _type = "Task" [ 1783.684350] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1783.692827] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470868, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1783.806374] env[68617]: DEBUG nova.compute.manager [req-27195ee6-c815-48b9-a88b-282137ad016d req-bcf81f40-b257-4a1e-a429-99238e38d285 service nova] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Received event network-vif-plugged-31162b00-a40f-44e3-a949-7b0cf92da041 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1783.806429] env[68617]: DEBUG oslo_concurrency.lockutils [req-27195ee6-c815-48b9-a88b-282137ad016d req-bcf81f40-b257-4a1e-a429-99238e38d285 service nova] Acquiring lock "922c8926-c636-4463-85d6-4f2a6325b85a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1783.806648] env[68617]: DEBUG oslo_concurrency.lockutils [req-27195ee6-c815-48b9-a88b-282137ad016d req-bcf81f40-b257-4a1e-a429-99238e38d285 service nova] Lock "922c8926-c636-4463-85d6-4f2a6325b85a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1783.806890] env[68617]: DEBUG oslo_concurrency.lockutils [req-27195ee6-c815-48b9-a88b-282137ad016d req-bcf81f40-b257-4a1e-a429-99238e38d285 service nova] Lock "922c8926-c636-4463-85d6-4f2a6325b85a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1783.806990] env[68617]: DEBUG nova.compute.manager [req-27195ee6-c815-48b9-a88b-282137ad016d req-bcf81f40-b257-4a1e-a429-99238e38d285 service nova] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] No waiting events found dispatching network-vif-plugged-31162b00-a40f-44e3-a949-7b0cf92da041 {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1783.807177] env[68617]: WARNING nova.compute.manager [req-27195ee6-c815-48b9-a88b-282137ad016d req-bcf81f40-b257-4a1e-a429-99238e38d285 service nova] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Received unexpected event network-vif-plugged-31162b00-a40f-44e3-a949-7b0cf92da041 for instance with vm_state building and task_state spawning. [ 1783.807334] env[68617]: DEBUG nova.compute.manager [req-27195ee6-c815-48b9-a88b-282137ad016d req-bcf81f40-b257-4a1e-a429-99238e38d285 service nova] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Received event network-changed-31162b00-a40f-44e3-a949-7b0cf92da041 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1783.807485] env[68617]: DEBUG nova.compute.manager [req-27195ee6-c815-48b9-a88b-282137ad016d req-bcf81f40-b257-4a1e-a429-99238e38d285 service nova] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Refreshing instance network info cache due to event network-changed-31162b00-a40f-44e3-a949-7b0cf92da041.
{{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1783.807675] env[68617]: DEBUG oslo_concurrency.lockutils [req-27195ee6-c815-48b9-a88b-282137ad016d req-bcf81f40-b257-4a1e-a429-99238e38d285 service nova] Acquiring lock "refresh_cache-922c8926-c636-4463-85d6-4f2a6325b85a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1783.807801] env[68617]: DEBUG oslo_concurrency.lockutils [req-27195ee6-c815-48b9-a88b-282137ad016d req-bcf81f40-b257-4a1e-a429-99238e38d285 service nova] Acquired lock "refresh_cache-922c8926-c636-4463-85d6-4f2a6325b85a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1783.807954] env[68617]: DEBUG nova.network.neutron [req-27195ee6-c815-48b9-a88b-282137ad016d req-bcf81f40-b257-4a1e-a429-99238e38d285 service nova] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Refreshing network info cache for port 31162b00-a40f-44e3-a949-7b0cf92da041 {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1784.072917] env[68617]: DEBUG nova.network.neutron [req-27195ee6-c815-48b9-a88b-282137ad016d req-bcf81f40-b257-4a1e-a429-99238e38d285 service nova] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Updated VIF entry in instance network info cache for port 31162b00-a40f-44e3-a949-7b0cf92da041. {{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1784.073285] env[68617]: DEBUG nova.network.neutron [req-27195ee6-c815-48b9-a88b-282137ad016d req-bcf81f40-b257-4a1e-a429-99238e38d285 service nova] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Updating instance_info_cache with network_info: [{"id": "31162b00-a40f-44e3-a949-7b0cf92da041", "address": "fa:16:3e:9f:66:82", "network": {"id": "65bec07e-2fec-40ce-ac24-d75d61493fed", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-174756301-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c07119c006e84a66bf7a37c1920f3694", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "79ece966-6187-47d7-bce7-cc39df14ac67", "external-id": "nsx-vlan-transportzone-472", "segmentation_id": 472, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap31162b00-a4", "ovs_interfaceid": "31162b00-a40f-44e3-a949-7b0cf92da041", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1784.082337] env[68617]: DEBUG oslo_concurrency.lockutils [req-27195ee6-c815-48b9-a88b-282137ad016d req-bcf81f40-b257-4a1e-a429-99238e38d285 service nova] Releasing lock "refresh_cache-922c8926-c636-4463-85d6-4f2a6325b85a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1784.193797] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470868, 'name': CreateVM_Task, 'duration_secs': 0.318487} completed successfully. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1784.193978] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1784.194732] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1784.194903] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1784.195255] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1784.195495] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5d82a10c-c9c3-4d93-9fee-c415e1d320af {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1784.199742] env[68617]: DEBUG oslo_vmware.api [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Waiting for the task: (returnval){ [ 1784.199742] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]5281f1a4-56ef-ce58-7bd1-815b757f1ba7" [ 1784.199742] env[68617]: _type = "Task" [ 1784.199742] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1784.207308] env[68617]: DEBUG oslo_vmware.api [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]5281f1a4-56ef-ce58-7bd1-815b757f1ba7, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1784.710506] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1784.710928] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1784.710928] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1785.398536] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d257f5f5-2eb5-4c3e-b15c-007d64111732 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Acquiring lock "922c8926-c636-4463-85d6-4f2a6325b85a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1791.699465] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1791.699873] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1791.699873] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1791.723793] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1791.723982] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1791.724087] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Skipping network cache update for instance because it is Building.
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1791.724216] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1791.724337] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1791.724461] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1791.724576] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1791.724703] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1791.724880] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1791.725051] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1791.725220] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1796.698690] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1796.698994] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1796.699146] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1797.699593] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1797.699855] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1797.700045] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1800.700086] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1800.711143] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1800.711358] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1800.711524] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1800.711672] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1800.712738] env[68617]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-269ecbc4-d990-4917-8ea8-59deaa5f0fae {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1800.721435] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd42f311-f723-4512-a4c1-f82e23ed6f9b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1800.734938] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67821f8f-5d86-4b38-9670-44add6b1af9b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1800.740970] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-390641c4-ae97-4ab7-a64e-08ebc54cead2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1800.769107] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180915MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1800.769245] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1800.769426] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1800.841159] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance ee6efd93-25be-4268-afe9-ba39e543a4fb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1800.841331] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 1605028f-5d6d-4ac4-8416-c0465982c53a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1800.841457] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance fc1043b8-535d-4af0-b92b-1f43580cdc9a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1800.841578] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1800.841694] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f54002b0-d60e-44ff-82a5-ef2f5193c48c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1800.841830] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 2c950cba-7698-48e0-8852-bf569f58f967 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1800.842057] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 12ed2a40-3d74-49a2-95b4-ccaaf58c8060 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1800.842225] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 21d0560a-fde3-4c16-b2fc-06d6f8668a7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1800.842353] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 902b5ab9-23b8-450f-853a-b2da889c3afd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1800.842468] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 922c8926-c636-4463-85d6-4f2a6325b85a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1800.852857] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b1a8dc60-af98-4f80-96cf-b2550ea8c13a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1800.862530] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a4ab788d-327a-47cc-8ae7-e1b9be889759 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1800.871945] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance fe0d64a6-6ce6-4ef5-8ae1-a160c5ec0987 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1800.872174] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1800.872316] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1801.013429] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-277652bd-2f0e-4aa0-ba3a-8980357c8690 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.020682] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-286a2850-a6d3-43b5-83b9-5c96c8024a2a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.050500] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b5dc08a-9233-4510-983d-f4330f6ab468 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.057754] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-651c972b-01d8-4c02-b8d1-67c43025c122 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.070656] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1801.078910] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1801.094817] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1801.095038] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.326s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1803.090016] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1803.090391] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1827.519057] env[68617]: WARNING oslo_vmware.rw_handles [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1827.519057] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1827.519057] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1827.519057] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1827.519057] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1827.519057] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 1827.519057] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1827.519057] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1827.519057] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1827.519057] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1827.519057] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1827.519057] env[68617]: ERROR oslo_vmware.rw_handles [ 1827.519057] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/64e6bb17-328c-4f6f-a98f-eeeb189b7a90/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) 
fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1827.521093] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1827.521389] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Copying Virtual Disk [datastore2] vmware_temp/64e6bb17-328c-4f6f-a98f-eeeb189b7a90/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/64e6bb17-328c-4f6f-a98f-eeeb189b7a90/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1827.521718] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b04a0ae9-1ea1-4e11-ba9e-dcfd9fc17ae1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1827.530057] env[68617]: DEBUG oslo_vmware.api [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Waiting for the task: (returnval){ [ 1827.530057] env[68617]: value = "task-3470869" [ 1827.530057] env[68617]: _type = "Task" [ 1827.530057] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1827.537549] env[68617]: DEBUG oslo_vmware.api [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Task: {'id': task-3470869, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1828.040712] env[68617]: DEBUG oslo_vmware.exceptions [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Fault InvalidArgument not matched. 
{{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1828.041017] env[68617]: DEBUG oslo_concurrency.lockutils [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1828.041611] env[68617]: ERROR nova.compute.manager [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1828.041611] env[68617]: Faults: ['InvalidArgument'] [ 1828.041611] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Traceback (most recent call last): [ 1828.041611] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1828.041611] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] yield resources [ 1828.041611] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1828.041611] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] self.driver.spawn(context, instance, image_meta, [ 1828.041611] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1828.041611] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1828.041611] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1828.041611] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] self._fetch_image_if_missing(context, vi) [ 1828.041611] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1828.042219] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] image_cache(vi, tmp_image_ds_loc) [ 1828.042219] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1828.042219] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] vm_util.copy_virtual_disk( [ 1828.042219] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1828.042219] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] session._wait_for_task(vmdk_copy_task) [ 1828.042219] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1828.042219] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] return self.wait_for_task(task_ref) [ 1828.042219] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1828.042219] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] return evt.wait() [ 1828.042219] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1828.042219] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] result = hub.switch() [ 1828.042219] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1828.042219] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] return self.greenlet.switch() [ 1828.042625] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1828.042625] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] self.f(*self.args, **self.kw) [ 1828.042625] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1828.042625] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] raise exceptions.translate_fault(task_info.error) [ 1828.042625] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1828.042625] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Faults: ['InvalidArgument'] [ 1828.042625] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] [ 1828.042625] env[68617]: INFO nova.compute.manager [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Terminating instance [ 1828.044424] env[68617]: DEBUG oslo_concurrency.lockutils [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1828.044666] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1828.045348] env[68617]: DEBUG nova.compute.manager [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef 
tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1828.045538] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1828.045784] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4bdd735b-185f-4f4c-9a19-6014ff3a84fc {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1828.048157] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e606d2a-ce71-44ed-a83f-e561d7e15100 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1828.056683] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1828.056914] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0192a1a6-b23c-43cb-bbc7-4d3bb62c94e1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1828.059141] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1828.059314] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1828.060244] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fb395ecb-64c8-4376-b835-1bd5661ab23e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1828.066028] env[68617]: DEBUG oslo_vmware.api [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Waiting for the task: (returnval){ [ 1828.066028] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]528da23b-1d0c-0160-71a2-10aa9472e512" [ 1828.066028] env[68617]: _type = "Task" [ 1828.066028] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1828.080180] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1828.080430] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Creating directory with path [datastore2] vmware_temp/1648eb3c-47e1-4674-ae1a-c603d7fe4950/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1828.080646] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-48db0264-e9d2-484e-be2a-f8618f029a34 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1828.100479] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Created directory with path [datastore2] vmware_temp/1648eb3c-47e1-4674-ae1a-c603d7fe4950/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1828.100690] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Fetch image to [datastore2] vmware_temp/1648eb3c-47e1-4674-ae1a-c603d7fe4950/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1828.100862] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/1648eb3c-47e1-4674-ae1a-c603d7fe4950/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1828.101669] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-922d3c29-e43d-420c-bc20-4ae8c104558b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1828.109217] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d2372d8-cc61-423a-8c42-4dfd70ac1d21 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1828.120673] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cacf2ca-c698-494e-b56b-77c0aa9a5890 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1828.124593] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef 
tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1828.124790] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1828.124959] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Deleting the datastore file [datastore2] ee6efd93-25be-4268-afe9-ba39e543a4fb {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1828.125216] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-439f6443-c763-4d39-abea-a07b55b62555 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1828.130824] env[68617]: DEBUG oslo_vmware.api [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Waiting for the task: (returnval){ [ 1828.130824] env[68617]: value = "task-3470871" [ 1828.130824] env[68617]: _type = "Task" [ 1828.130824] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1828.160028] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94c3d59f-363e-43da-b714-d425fbae5bc7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1828.167020] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a48cafa7-83c4-414f-8919-f60ff7461d79 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1828.193256] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1828.246223] env[68617]: DEBUG oslo_vmware.rw_handles [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1648eb3c-47e1-4674-ae1a-c603d7fe4950/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1828.305364] env[68617]: DEBUG oslo_vmware.rw_handles [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1828.305600] env[68617]: DEBUG oslo_vmware.rw_handles [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1648eb3c-47e1-4674-ae1a-c603d7fe4950/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1828.640990] env[68617]: DEBUG oslo_vmware.api [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Task: {'id': task-3470871, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.096315} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1828.641427] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1828.641471] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1828.641610] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1828.641785] env[68617]: INFO nova.compute.manager [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Took 0.60 seconds to destroy the instance on the hypervisor. 
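
[editor's note] The CopyVirtualDisk_Task earlier in this excerpt was polled until vCenter recorded a fault on it, at which point _poll_task raised the translated VimFaultException shown in the traceback; the DeleteDatastoreFile_Task just above completed the same loop successfully. Below is a minimal, self-contained model of that poll-until-done-or-raise pattern. The names (VimFault, TaskInfo, wait_for_task) are illustrative stand-ins, not oslo.vmware's actual classes.

import time
from dataclasses import dataclass


class VimFault(Exception):
    """Stand-in for oslo_vmware.exceptions.VimFaultException."""
    def __init__(self, msg, fault_list):
        super().__init__(msg)
        self.fault_list = fault_list  # e.g. ['InvalidArgument']


@dataclass
class TaskInfo:
    state: str            # 'running' | 'success' | 'error'
    progress: int = 0
    error_msg: str = ""
    faults: tuple = ()


def wait_for_task(get_task_info, interval=0.5):
    """Poll a task until it succeeds, raising on a recorded fault.

    get_task_info: callable returning the latest TaskInfo, the way
    _poll_task re-reads the task state on every loopingcall tick.
    """
    while True:
        info = get_task_info()
        if info.state == "success":
            return info
        if info.state == "error":
            # Mirrors `raise exceptions.translate_fault(task_info.error)`
            # in the traceback above.
            raise VimFault(info.error_msg, list(info.faults))
        time.sleep(interval)  # polls in this log are roughly 0.5s apart

A TaskInfo in state 'error' with faults=('InvalidArgument',) and error_msg 'A specified parameter was not correct: fileType' reproduces the exception path seen twice in this excerpt; a state of 'success' reproduces the DeleteDatastoreFile_Task completion above.
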
[ 1828.643888] env[68617]: DEBUG nova.compute.claims [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1828.644074] env[68617]: DEBUG oslo_concurrency.lockutils [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1828.644331] env[68617]: DEBUG oslo_concurrency.lockutils [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1828.844022] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b0bce36-c546-4992-89ce-7b5359b9e1ff {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1828.852477] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da4ab517-8be2-4342-984b-65686a424b5a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1828.894437] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-564879d6-1aee-4f16-be4c-f20ded5bcd1d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1828.901541] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8cb5d0fa-4f7d-48e3-ab1a-f8b32b512ea3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1828.914019] env[68617]: DEBUG nova.compute.provider_tree [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1828.922641] env[68617]: DEBUG nova.scheduler.client.report [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1828.935393] env[68617]: DEBUG oslo_concurrency.lockutils [None 
req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.291s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1828.935919] env[68617]: ERROR nova.compute.manager [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1828.935919] env[68617]: Faults: ['InvalidArgument'] [ 1828.935919] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Traceback (most recent call last): [ 1828.935919] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1828.935919] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] self.driver.spawn(context, instance, image_meta, [ 1828.935919] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1828.935919] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1828.935919] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1828.935919] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] self._fetch_image_if_missing(context, vi) [ 1828.935919] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1828.935919] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] image_cache(vi, tmp_image_ds_loc) [ 1828.935919] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1828.936296] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] vm_util.copy_virtual_disk( [ 1828.936296] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1828.936296] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] session._wait_for_task(vmdk_copy_task) [ 1828.936296] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1828.936296] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] return self.wait_for_task(task_ref) [ 1828.936296] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1828.936296] env[68617]: ERROR nova.compute.manager [instance: 
ee6efd93-25be-4268-afe9-ba39e543a4fb] return evt.wait() [ 1828.936296] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1828.936296] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] result = hub.switch() [ 1828.936296] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1828.936296] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] return self.greenlet.switch() [ 1828.936296] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1828.936296] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] self.f(*self.args, **self.kw) [ 1828.936635] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1828.936635] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] raise exceptions.translate_fault(task_info.error) [ 1828.936635] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1828.936635] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Faults: ['InvalidArgument'] [ 1828.936635] env[68617]: ERROR nova.compute.manager [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] [ 1828.936635] env[68617]: DEBUG nova.compute.utils [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1828.938377] env[68617]: DEBUG nova.compute.manager [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Build of instance ee6efd93-25be-4268-afe9-ba39e543a4fb was re-scheduled: A specified parameter was not correct: fileType [ 1828.938377] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1828.938776] env[68617]: DEBUG nova.compute.manager [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1828.938951] env[68617]: DEBUG nova.compute.manager [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1828.939155] env[68617]: DEBUG nova.compute.manager [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1828.939327] env[68617]: DEBUG nova.network.neutron [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1829.227827] env[68617]: DEBUG nova.network.neutron [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1829.239042] env[68617]: INFO nova.compute.manager [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Took 0.30 seconds to deallocate network for instance. [ 1829.347051] env[68617]: INFO nova.scheduler.client.report [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Deleted allocations for instance ee6efd93-25be-4268-afe9-ba39e543a4fb [ 1829.370461] env[68617]: DEBUG oslo_concurrency.lockutils [None req-09a51d05-a70c-46b4-9494-5d7dc38632ef tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Lock "ee6efd93-25be-4268-afe9-ba39e543a4fb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 529.671s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1829.375021] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e6e8305f-8d9a-4bf2-a290-8f8b5922463c tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Lock "ee6efd93-25be-4268-afe9-ba39e543a4fb" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 333.639s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1829.375021] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e6e8305f-8d9a-4bf2-a290-8f8b5922463c tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Acquiring lock "ee6efd93-25be-4268-afe9-ba39e543a4fb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1829.375021] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e6e8305f-8d9a-4bf2-a290-8f8b5922463c tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Lock 
"ee6efd93-25be-4268-afe9-ba39e543a4fb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1829.375318] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e6e8305f-8d9a-4bf2-a290-8f8b5922463c tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Lock "ee6efd93-25be-4268-afe9-ba39e543a4fb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1829.375318] env[68617]: INFO nova.compute.manager [None req-e6e8305f-8d9a-4bf2-a290-8f8b5922463c tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Terminating instance [ 1829.377832] env[68617]: DEBUG nova.compute.manager [None req-e6e8305f-8d9a-4bf2-a290-8f8b5922463c tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1829.377832] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e6e8305f-8d9a-4bf2-a290-8f8b5922463c tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1829.377832] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-30b857e0-75c7-45a7-abb1-5439024cf305 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1829.386638] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4227c775-c9eb-4b13-8b83-fa92718dbf85 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1829.398540] env[68617]: DEBUG nova.compute.manager [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1829.421590] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-e6e8305f-8d9a-4bf2-a290-8f8b5922463c tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ee6efd93-25be-4268-afe9-ba39e543a4fb could not be found. 
[ 1829.421991] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e6e8305f-8d9a-4bf2-a290-8f8b5922463c tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1829.422257] env[68617]: INFO nova.compute.manager [None req-e6e8305f-8d9a-4bf2-a290-8f8b5922463c tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1829.422557] env[68617]: DEBUG oslo.service.loopingcall [None req-e6e8305f-8d9a-4bf2-a290-8f8b5922463c tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1829.422834] env[68617]: DEBUG nova.compute.manager [-] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1829.422977] env[68617]: DEBUG nova.network.neutron [-] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1829.445482] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1829.445733] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1829.447273] env[68617]: INFO nova.compute.claims [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1829.450781] env[68617]: DEBUG nova.network.neutron [-] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1829.467212] env[68617]: INFO nova.compute.manager [-] [instance: ee6efd93-25be-4268-afe9-ba39e543a4fb] Took 0.04 seconds to deallocate network for instance. 
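
[editor's note] The instance_claim for b1a8dc60-af98-4f80-96cf-b2550ea8c13a above is serialized by the "compute_resources" lock and checked against the inventory repeatedly dumped in this log (VCPU total 48 at allocation_ratio 4.0, MEMORY_MB total 196590 with 512 reserved, DISK_GB total 400). A minimal sketch under the assumption that usable capacity is total * allocation_ratio - reserved; the claim plumbing here is illustrative, not Nova's actual ResourceTracker.

import threading

_COMPUTE_RESOURCES = threading.Lock()   # plays the "compute_resources" lock


def usable(total, allocation_ratio, reserved=0):
    return total * allocation_ratio - reserved


def instance_claim(used, requested):
    """Claim VCPU/MEMORY_MB/DISK_GB for one instance, or raise."""
    caps = {
        "VCPU": usable(48, 4.0),               # -> 192 claimable vCPUs
        "MEMORY_MB": usable(196590, 1.0, 512),
        "DISK_GB": usable(400, 1.0),
    }
    with _COMPUTE_RESOURCES:                   # serialize with other claims
        for rc, amount in requested.items():
            if used.get(rc, 0) + amount > caps[rc]:
                raise RuntimeError(f"claim failed for {rc}")
            used[rc] = used.get(rc, 0) + amount
    return used


# The m1.nano flavor in this run (vcpus=1, memory_mb=128, root_gb=1)
# requests exactly the footprint each healed allocation above shows,
# starting from the final resource view of 10 vCPUs / 1792 MB / 10 GB used:
claimed = instance_claim({"VCPU": 10, "MEMORY_MB": 1792, "DISK_GB": 10},
                         {"VCPU": 1, "MEMORY_MB": 128, "DISK_GB": 1})

A claim that fails this check is aborted under the same lock, which is the abort_instance_claim acquire/release pair visible after the spawn failure earlier in this excerpt.
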
[ 1829.570550] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e6e8305f-8d9a-4bf2-a290-8f8b5922463c tempest-AttachInterfacesUnderV243Test-1023400965 tempest-AttachInterfacesUnderV243Test-1023400965-project-member] Lock "ee6efd93-25be-4268-afe9-ba39e543a4fb" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.199s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1829.655467] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b34a8494-910d-47f0-bec9-227abcb71dc8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1829.663143] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9b9e7fe-be76-4236-acad-b41abcf47df3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1829.693176] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03e2250c-04af-4c9d-8dc9-6aa81a3eeda1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1829.700233] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d4bd105-4fef-4816-b8be-97c765ffead2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1829.715160] env[68617]: DEBUG nova.compute.provider_tree [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1829.725069] env[68617]: DEBUG nova.scheduler.client.report [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1829.738270] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.292s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1829.738734] env[68617]: DEBUG nova.compute.manager [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Start building networks asynchronously for instance. 
{{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1829.777532] env[68617]: DEBUG nova.compute.utils [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1829.778739] env[68617]: DEBUG nova.compute.manager [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1829.778902] env[68617]: DEBUG nova.network.neutron [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1829.788409] env[68617]: DEBUG nova.compute.manager [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1829.842873] env[68617]: DEBUG nova.policy [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'be1fb3906fa449949fc0b5eae9cab9fb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1e11c4e5c25a42119594647403c0199b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 1829.851660] env[68617]: DEBUG nova.compute.manager [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1829.876188] env[68617]: DEBUG nova.virt.hardware [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1829.876426] env[68617]: DEBUG nova.virt.hardware [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1829.876580] env[68617]: DEBUG nova.virt.hardware [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1829.876757] env[68617]: DEBUG nova.virt.hardware [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1829.877188] env[68617]: DEBUG nova.virt.hardware [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1829.877188] env[68617]: DEBUG nova.virt.hardware [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1829.877321] env[68617]: DEBUG nova.virt.hardware [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1829.877455] env[68617]: DEBUG nova.virt.hardware [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1829.877623] env[68617]: DEBUG 
nova.virt.hardware [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1829.877783] env[68617]: DEBUG nova.virt.hardware [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1829.877948] env[68617]: DEBUG nova.virt.hardware [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1829.878809] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13bcb863-9a4b-4a1f-baaa-258c50a66aea {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1829.886625] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a206d32-47af-4463-a9ab-aef6ce8c1096 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1830.135291] env[68617]: DEBUG nova.network.neutron [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Successfully created port: 037ff17d-c08f-4bfe-8bc2-64285367ebac {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1830.680697] env[68617]: DEBUG nova.network.neutron [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Successfully updated port: 037ff17d-c08f-4bfe-8bc2-64285367ebac {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1830.691556] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "refresh_cache-b1a8dc60-af98-4f80-96cf-b2550ea8c13a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1830.692010] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquired lock "refresh_cache-b1a8dc60-af98-4f80-96cf-b2550ea8c13a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1830.692242] env[68617]: DEBUG nova.network.neutron [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1830.735151] env[68617]: DEBUG nova.network.neutron [None 
req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1830.905831] env[68617]: DEBUG nova.network.neutron [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Updating instance_info_cache with network_info: [{"id": "037ff17d-c08f-4bfe-8bc2-64285367ebac", "address": "fa:16:3e:6b:1a:a2", "network": {"id": "1d9c32bb-1c81-4af6-8d3f-365a52df11cd", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-313904480-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1e11c4e5c25a42119594647403c0199b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d62c1cf-f39a-4626-9552-f1e13c692636", "external-id": "nsx-vlan-transportzone-748", "segmentation_id": 748, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap037ff17d-c0", "ovs_interfaceid": "037ff17d-c08f-4bfe-8bc2-64285367ebac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1830.919124] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Releasing lock "refresh_cache-b1a8dc60-af98-4f80-96cf-b2550ea8c13a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1830.919466] env[68617]: DEBUG nova.compute.manager [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Instance network_info: |[{"id": "037ff17d-c08f-4bfe-8bc2-64285367ebac", "address": "fa:16:3e:6b:1a:a2", "network": {"id": "1d9c32bb-1c81-4af6-8d3f-365a52df11cd", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-313904480-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1e11c4e5c25a42119594647403c0199b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d62c1cf-f39a-4626-9552-f1e13c692636", "external-id": "nsx-vlan-transportzone-748", "segmentation_id": 748, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap037ff17d-c0", "ovs_interfaceid": 
"037ff17d-c08f-4bfe-8bc2-64285367ebac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1830.919892] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6b:1a:a2', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6d62c1cf-f39a-4626-9552-f1e13c692636', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '037ff17d-c08f-4bfe-8bc2-64285367ebac', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1830.928450] env[68617]: DEBUG oslo.service.loopingcall [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1830.928998] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1830.929253] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e01d3287-e24f-47a7-b002-5085f7e29e7a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1830.949135] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1830.949135] env[68617]: value = "task-3470872" [ 1830.949135] env[68617]: _type = "Task" [ 1830.949135] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1830.957574] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470872, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1831.459718] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470872, 'name': CreateVM_Task, 'duration_secs': 0.274935} completed successfully. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1831.459929] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1831.460564] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1831.460733] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1831.461068] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1831.461335] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-791d0825-9fa3-442b-ade2-0cc4a25c04bf {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1831.465578] env[68617]: DEBUG oslo_vmware.api [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Waiting for the task: (returnval){ [ 1831.465578] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52a6f072-f30e-c03c-1504-d6004a97d1d4" [ 1831.465578] env[68617]: _type = "Task" [ 1831.465578] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1831.480321] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1831.480551] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1831.480756] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1831.486673] env[68617]: DEBUG nova.compute.manager [req-e52ed1b8-1aa1-476d-b3ce-40e4e2ccb2f8 req-ce68f424-dad1-4ba3-9f0e-4ee9f42f2679 service nova] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Received event network-vif-plugged-037ff17d-c08f-4bfe-8bc2-64285367ebac {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1831.486840] env[68617]: DEBUG oslo_concurrency.lockutils [req-e52ed1b8-1aa1-476d-b3ce-40e4e2ccb2f8 req-ce68f424-dad1-4ba3-9f0e-4ee9f42f2679 service nova] Acquiring lock "b1a8dc60-af98-4f80-96cf-b2550ea8c13a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1831.487114] env[68617]: DEBUG oslo_concurrency.lockutils [req-e52ed1b8-1aa1-476d-b3ce-40e4e2ccb2f8 req-ce68f424-dad1-4ba3-9f0e-4ee9f42f2679 service nova] Lock "b1a8dc60-af98-4f80-96cf-b2550ea8c13a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1831.487262] env[68617]: DEBUG oslo_concurrency.lockutils [req-e52ed1b8-1aa1-476d-b3ce-40e4e2ccb2f8 req-ce68f424-dad1-4ba3-9f0e-4ee9f42f2679 service nova] Lock "b1a8dc60-af98-4f80-96cf-b2550ea8c13a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1831.487426] env[68617]: DEBUG nova.compute.manager [req-e52ed1b8-1aa1-476d-b3ce-40e4e2ccb2f8 req-ce68f424-dad1-4ba3-9f0e-4ee9f42f2679 service nova] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] No waiting events found dispatching network-vif-plugged-037ff17d-c08f-4bfe-8bc2-64285367ebac {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1831.487593] env[68617]: WARNING nova.compute.manager [req-e52ed1b8-1aa1-476d-b3ce-40e4e2ccb2f8 req-ce68f424-dad1-4ba3-9f0e-4ee9f42f2679 service nova] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Received unexpected event 
network-vif-plugged-037ff17d-c08f-4bfe-8bc2-64285367ebac for instance with vm_state building and task_state spawning. [ 1831.487745] env[68617]: DEBUG nova.compute.manager [req-e52ed1b8-1aa1-476d-b3ce-40e4e2ccb2f8 req-ce68f424-dad1-4ba3-9f0e-4ee9f42f2679 service nova] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Received event network-changed-037ff17d-c08f-4bfe-8bc2-64285367ebac {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1831.487893] env[68617]: DEBUG nova.compute.manager [req-e52ed1b8-1aa1-476d-b3ce-40e4e2ccb2f8 req-ce68f424-dad1-4ba3-9f0e-4ee9f42f2679 service nova] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Refreshing instance network info cache due to event network-changed-037ff17d-c08f-4bfe-8bc2-64285367ebac. {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1831.488090] env[68617]: DEBUG oslo_concurrency.lockutils [req-e52ed1b8-1aa1-476d-b3ce-40e4e2ccb2f8 req-ce68f424-dad1-4ba3-9f0e-4ee9f42f2679 service nova] Acquiring lock "refresh_cache-b1a8dc60-af98-4f80-96cf-b2550ea8c13a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1831.488263] env[68617]: DEBUG oslo_concurrency.lockutils [req-e52ed1b8-1aa1-476d-b3ce-40e4e2ccb2f8 req-ce68f424-dad1-4ba3-9f0e-4ee9f42f2679 service nova] Acquired lock "refresh_cache-b1a8dc60-af98-4f80-96cf-b2550ea8c13a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1831.488438] env[68617]: DEBUG nova.network.neutron [req-e52ed1b8-1aa1-476d-b3ce-40e4e2ccb2f8 req-ce68f424-dad1-4ba3-9f0e-4ee9f42f2679 service nova] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Refreshing network info cache for port 037ff17d-c08f-4bfe-8bc2-64285367ebac {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1831.732532] env[68617]: DEBUG nova.network.neutron [req-e52ed1b8-1aa1-476d-b3ce-40e4e2ccb2f8 req-ce68f424-dad1-4ba3-9f0e-4ee9f42f2679 service nova] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Updated VIF entry in instance network info cache for port 037ff17d-c08f-4bfe-8bc2-64285367ebac. 
{{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1831.732901] env[68617]: DEBUG nova.network.neutron [req-e52ed1b8-1aa1-476d-b3ce-40e4e2ccb2f8 req-ce68f424-dad1-4ba3-9f0e-4ee9f42f2679 service nova] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Updating instance_info_cache with network_info: [{"id": "037ff17d-c08f-4bfe-8bc2-64285367ebac", "address": "fa:16:3e:6b:1a:a2", "network": {"id": "1d9c32bb-1c81-4af6-8d3f-365a52df11cd", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-313904480-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1e11c4e5c25a42119594647403c0199b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d62c1cf-f39a-4626-9552-f1e13c692636", "external-id": "nsx-vlan-transportzone-748", "segmentation_id": 748, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap037ff17d-c0", "ovs_interfaceid": "037ff17d-c08f-4bfe-8bc2-64285367ebac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1831.742479] env[68617]: DEBUG oslo_concurrency.lockutils [req-e52ed1b8-1aa1-476d-b3ce-40e4e2ccb2f8 req-ce68f424-dad1-4ba3-9f0e-4ee9f42f2679 service nova] Releasing lock "refresh_cache-b1a8dc60-af98-4f80-96cf-b2550ea8c13a" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1851.698842] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1851.699199] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Cleaning up deleted instances with incomplete migration {{(pid=68617) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1852.710438] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1852.710438] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1852.710438] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1852.732904] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Skipping network cache update for instance because it is Building. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1852.733259] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1852.733541] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1852.734167] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1852.734167] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1852.735020] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1852.735020] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1852.735020] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1852.735020] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1852.735020] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1852.735425] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1853.700400] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1853.700400] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Cleaning up deleted instances {{(pid=68617) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1853.716398] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] There are 0 instances to clean {{(pid=68617) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1853.716398] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1856.718462] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1856.739359] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1857.698996] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1857.699194] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1858.700066] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1858.700437] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1859.699580] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1860.699416] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1860.711752] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1860.712220] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1860.712440] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1860.712605] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1860.713814] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80ac25ba-cf23-43c2-aeae-816556542e97 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1860.722922] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46bf28d7-8e6f-428d-b446-559caf317b70 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1860.737503] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ab0a0f4-74a3-4a85-a6b1-27f436c56563 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1860.744187] env[68617]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7184b6d2-7496-4c36-ac8b-75027dfb5c11 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1860.775425] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180934MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1860.775591] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1860.775785] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1860.931698] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 1605028f-5d6d-4ac4-8416-c0465982c53a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1860.931895] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance fc1043b8-535d-4af0-b92b-1f43580cdc9a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1860.932040] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1860.932197] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f54002b0-d60e-44ff-82a5-ef2f5193c48c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1860.932327] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 2c950cba-7698-48e0-8852-bf569f58f967 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1860.932389] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 12ed2a40-3d74-49a2-95b4-ccaaf58c8060 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1860.932507] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 21d0560a-fde3-4c16-b2fc-06d6f8668a7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1860.932674] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 902b5ab9-23b8-450f-853a-b2da889c3afd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1860.932724] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 922c8926-c636-4463-85d6-4f2a6325b85a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1860.932903] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b1a8dc60-af98-4f80-96cf-b2550ea8c13a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1860.947611] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a4ab788d-327a-47cc-8ae7-e1b9be889759 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1860.960815] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance fe0d64a6-6ce6-4ef5-8ae1-a160c5ec0987 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1860.961169] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1860.961333] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1860.979429] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Refreshing inventories for resource provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1860.994483] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Updating ProviderTree inventory for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1860.994726] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Updating inventory in ProviderTree for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1861.006696] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Refreshing aggregate associations for resource provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f, aggregates: None {{(pid=68617) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1861.026413] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Refreshing trait associations for resource provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_VMDK {{(pid=68617) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1861.171587] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b001687a-b2d0-49bc-b9ba-f649a4257dcf {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.179435] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-3688f6d0-b223-428e-b638-3eb88eb65155 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.211839] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5822917c-f849-4294-8241-58e928157095 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.219300] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a264987-e482-4bf1-8ebc-9cb9e003b705 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.232476] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1861.240618] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1861.254103] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1861.254262] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.478s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1865.248651] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1865.248917] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1877.529535] env[68617]: WARNING oslo_vmware.rw_handles [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1877.529535] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1877.529535] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1877.529535] env[68617]: ERROR 
oslo_vmware.rw_handles self._conn.getresponse() [ 1877.529535] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1877.529535] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 1877.529535] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1877.529535] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1877.529535] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1877.529535] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1877.529535] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1877.529535] env[68617]: ERROR oslo_vmware.rw_handles [ 1877.530567] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/1648eb3c-47e1-4674-ae1a-c603d7fe4950/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1877.531791] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1877.532048] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Copying Virtual Disk [datastore2] vmware_temp/1648eb3c-47e1-4674-ae1a-c603d7fe4950/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/1648eb3c-47e1-4674-ae1a-c603d7fe4950/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1877.532380] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-2f56b507-ea01-49ba-b4e0-fdda8581a584 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1877.540143] env[68617]: DEBUG oslo_vmware.api [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Waiting for the task: (returnval){ [ 1877.540143] env[68617]: value = "task-3470873" [ 1877.540143] env[68617]: _type = "Task" [ 1877.540143] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1877.548286] env[68617]: DEBUG oslo_vmware.api [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Task: {'id': task-3470873, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1878.050355] env[68617]: DEBUG oslo_vmware.exceptions [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Fault InvalidArgument not matched. {{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1878.050666] env[68617]: DEBUG oslo_concurrency.lockutils [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1878.051241] env[68617]: ERROR nova.compute.manager [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1878.051241] env[68617]: Faults: ['InvalidArgument'] [ 1878.051241] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Traceback (most recent call last): [ 1878.051241] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1878.051241] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] yield resources [ 1878.051241] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1878.051241] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] self.driver.spawn(context, instance, image_meta, [ 1878.051241] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1878.051241] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1878.051241] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1878.051241] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] self._fetch_image_if_missing(context, vi) [ 1878.051241] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1878.051656] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] image_cache(vi, tmp_image_ds_loc) [ 1878.051656] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1878.051656] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] vm_util.copy_virtual_disk( [ 1878.051656] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1878.051656] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] session._wait_for_task(vmdk_copy_task) [ 1878.051656] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1878.051656] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] return self.wait_for_task(task_ref) [ 1878.051656] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1878.051656] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] return evt.wait() [ 1878.051656] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1878.051656] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] result = hub.switch() [ 1878.051656] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1878.051656] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] return self.greenlet.switch() [ 1878.052083] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1878.052083] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] self.f(*self.args, **self.kw) [ 1878.052083] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1878.052083] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] raise exceptions.translate_fault(task_info.error) [ 1878.052083] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1878.052083] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Faults: ['InvalidArgument'] [ 1878.052083] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] [ 1878.052083] env[68617]: INFO nova.compute.manager [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Terminating instance [ 1878.053148] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1878.053337] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 
tempest-AttachVolumeNegativeTest-555816992-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1878.053572] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e4130b0f-1dbb-44b3-a7e8-fc7b2e7fbf1c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1878.055993] env[68617]: DEBUG nova.compute.manager [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1878.056159] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1878.056880] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ed8d9d7-e269-445f-acee-2e5c0fa29c0c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1878.063429] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1878.063645] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-fac2db8f-aca2-4d37-a832-3baf67778bd1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1878.065860] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1878.066039] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1878.066970] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-07ea637d-4027-40ed-8c22-c8bfbe77b79a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1878.071758] env[68617]: DEBUG oslo_vmware.api [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Waiting for the task: (returnval){ [ 1878.071758] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]5224b28b-845b-65e8-7e9a-e19be6032f3f" [ 1878.071758] env[68617]: _type = "Task" [ 1878.071758] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1878.079827] env[68617]: DEBUG oslo_vmware.api [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]5224b28b-845b-65e8-7e9a-e19be6032f3f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1878.131204] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1878.131442] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1878.131627] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Deleting the datastore file [datastore2] 1605028f-5d6d-4ac4-8416-c0465982c53a {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1878.131890] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-af345ff3-7297-4e47-be41-b13b80aeab05 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1878.137467] env[68617]: DEBUG oslo_vmware.api [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Waiting for the task: (returnval){ [ 1878.137467] env[68617]: value = "task-3470875" [ 1878.137467] env[68617]: _type = "Task" [ 1878.137467] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1878.145595] env[68617]: DEBUG oslo_vmware.api [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Task: {'id': task-3470875, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1878.583149] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1878.583455] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Creating directory with path [datastore2] vmware_temp/3bc78724-87b5-4bc6-a5cd-4ab917212c2b/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1878.583663] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cf315624-a469-49d9-8d7b-1d5c68d15754 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1878.594271] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Created directory with path [datastore2] vmware_temp/3bc78724-87b5-4bc6-a5cd-4ab917212c2b/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1878.594454] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Fetch image to [datastore2] vmware_temp/3bc78724-87b5-4bc6-a5cd-4ab917212c2b/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1878.594617] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/3bc78724-87b5-4bc6-a5cd-4ab917212c2b/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1878.595370] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7412257b-afe4-4fff-91c3-0eceb3857515 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1878.601653] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06fb91de-9d31-41c8-8ba6-b58472a9367b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1878.610328] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cee05c6-49f2-407d-bf65-8c9f48597957 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1878.644172] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-e0833128-80cb-42e6-8e74-b34f5b99a5f4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1878.653071] env[68617]: DEBUG oslo_vmware.api [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Task: {'id': task-3470875, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074632} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1878.653071] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1ba49fc8-ff35-4824-8208-dd65662bad7f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1878.654401] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1878.654593] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1878.654769] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1878.654945] env[68617]: INFO nova.compute.manager [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Took 0.60 seconds to destroy the instance on the hypervisor. 
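The "Waiting for the task: (returnval){ value = ... }" / "progress is 0%" / "completed successfully" records around here are the oslo.vmware task-polling pattern: each vCenter call (SearchDatastore_Task, DeleteDatastoreFile_Task) returns a task reference, and wait_for_task blocks the caller while the task's state is polled. A minimal sketch of that loop, assuming a hypothetical get_task_info() accessor in place of the real property-collector plumbing:

    import time

    class TaskFailed(Exception):
        """Stands in for oslo_vmware.exceptions.VimFaultException."""

    def wait_for_task(get_task_info, task_ref, interval=0.5):
        # Poll TaskInfo until vCenter marks the task finished; this is
        # what produces the "progress is N%" lines while it spins.
        while True:
            info = get_task_info(task_ref)   # hypothetical accessor
            if info.state == 'success':
                return info.result           # logged as "completed successfully"
            if info.state == 'error':
                # The fault text is what later surfaces in this log as
                # "A specified parameter was not correct: fileType".
                raise TaskFailed(info.error)
            time.sleep(interval)             # 'queued'/'running': try again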
[ 1878.657038] env[68617]: DEBUG nova.compute.claims [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1878.657207] env[68617]: DEBUG oslo_concurrency.lockutils [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1878.657442] env[68617]: DEBUG oslo_concurrency.lockutils [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1878.682462] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1878.732185] env[68617]: DEBUG oslo_vmware.rw_handles [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3bc78724-87b5-4bc6-a5cd-4ab917212c2b/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1878.791059] env[68617]: DEBUG oslo_vmware.rw_handles [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1878.791253] env[68617]: DEBUG oslo_vmware.rw_handles [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3bc78724-87b5-4bc6-a5cd-4ab917212c2b/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1878.900789] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f137edfa-6707-42c9-8f3f-280f89b704c6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1878.908301] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32c2bed8-c434-4780-a4d8-3964edea5633 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1878.937878] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2856d127-8e12-4cbf-923e-ceb7bf88332e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1878.945322] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1911caa4-7d86-46ec-b9d9-35f73432418c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1878.956776] env[68617]: DEBUG nova.compute.provider_tree [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1878.965416] env[68617]: DEBUG nova.scheduler.client.report [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1878.983223] env[68617]: DEBUG oslo_concurrency.lockutils [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.326s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1878.983819] env[68617]: ERROR nova.compute.manager [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1878.983819] env[68617]: Faults: ['InvalidArgument']
[ 1878.983819] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Traceback (most recent call last):
[ 1878.983819] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1878.983819] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] self.driver.spawn(context, instance, image_meta,
[ 1878.983819] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1878.983819] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1878.983819] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1878.983819] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] self._fetch_image_if_missing(context, vi)
[ 1878.983819] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1878.983819] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] image_cache(vi, tmp_image_ds_loc)
[ 1878.983819] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1878.984314] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] vm_util.copy_virtual_disk(
[ 1878.984314] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1878.984314] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] session._wait_for_task(vmdk_copy_task)
[ 1878.984314] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1878.984314] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] return self.wait_for_task(task_ref)
[ 1878.984314] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1878.984314] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] return evt.wait()
[ 1878.984314] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1878.984314] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] result = hub.switch()
[ 1878.984314] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1878.984314] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] return self.greenlet.switch()
[ 1878.984314] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1878.984314] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] self.f(*self.args, **self.kw)
[ 1878.984799] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1878.984799] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] raise exceptions.translate_fault(task_info.error)
[ 1878.984799] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1878.984799] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Faults: ['InvalidArgument']
[ 1878.984799] env[68617]: ERROR nova.compute.manager [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a]
[ 1878.984799] env[68617]: DEBUG nova.compute.utils [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1878.986056] env[68617]: DEBUG nova.compute.manager [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Build of instance 1605028f-5d6d-4ac4-8416-c0465982c53a was re-scheduled: A specified parameter was not correct: fileType
[ 1878.986056] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1878.986421] env[68617]: DEBUG nova.compute.manager [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1878.986607] env[68617]: DEBUG nova.compute.manager [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged.
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1878.986789] env[68617]: DEBUG nova.compute.manager [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1878.986952] env[68617]: DEBUG nova.network.neutron [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1879.294967] env[68617]: DEBUG nova.network.neutron [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1879.305589] env[68617]: INFO nova.compute.manager [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Took 0.32 seconds to deallocate network for instance. [ 1879.396439] env[68617]: INFO nova.scheduler.client.report [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Deleted allocations for instance 1605028f-5d6d-4ac4-8416-c0465982c53a [ 1879.419385] env[68617]: DEBUG oslo_concurrency.lockutils [None req-8c09fc83-ce86-4ab9-963f-1f17f2578564 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "1605028f-5d6d-4ac4-8416-c0465982c53a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 577.203s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1879.420592] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1dd8aa56-44f4-4ed5-b4c1-0de8a55f937b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "1605028f-5d6d-4ac4-8416-c0465982c53a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 381.477s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1879.420812] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1dd8aa56-44f4-4ed5-b4c1-0de8a55f937b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "1605028f-5d6d-4ac4-8416-c0465982c53a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1879.421022] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1dd8aa56-44f4-4ed5-b4c1-0de8a55f937b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "1605028f-5d6d-4ac4-8416-c0465982c53a-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1879.421192] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1dd8aa56-44f4-4ed5-b4c1-0de8a55f937b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "1605028f-5d6d-4ac4-8416-c0465982c53a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1879.423927] env[68617]: INFO nova.compute.manager [None req-1dd8aa56-44f4-4ed5-b4c1-0de8a55f937b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Terminating instance [ 1879.425630] env[68617]: DEBUG nova.compute.manager [None req-1dd8aa56-44f4-4ed5-b4c1-0de8a55f937b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1879.425875] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-1dd8aa56-44f4-4ed5-b4c1-0de8a55f937b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1879.426319] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0c6aa24e-730f-4eab-be84-66ffc8262f7f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1879.430731] env[68617]: DEBUG nova.compute.manager [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1879.438090] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43316fb9-53d0-4a89-8309-70fd018a455e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1879.468019] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-1dd8aa56-44f4-4ed5-b4c1-0de8a55f937b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 1605028f-5d6d-4ac4-8416-c0465982c53a could not be found. 
[ 1879.468162] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-1dd8aa56-44f4-4ed5-b4c1-0de8a55f937b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1879.468279] env[68617]: INFO nova.compute.manager [None req-1dd8aa56-44f4-4ed5-b4c1-0de8a55f937b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1879.468521] env[68617]: DEBUG oslo.service.loopingcall [None req-1dd8aa56-44f4-4ed5-b4c1-0de8a55f937b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1879.473032] env[68617]: DEBUG nova.compute.manager [-] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1879.473145] env[68617]: DEBUG nova.network.neutron [-] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1879.484668] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1879.484925] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1879.486494] env[68617]: INFO nova.compute.claims [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1879.497309] env[68617]: DEBUG nova.network.neutron [-] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1879.509031] env[68617]: INFO nova.compute.manager [-] [instance: 1605028f-5d6d-4ac4-8416-c0465982c53a] Took 0.04 seconds to deallocate network for instance. 
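Note the oslo.service looping call above ("Waiting for function ... _deallocate_network_with_retries to return"): network teardown is driven by a retry loop rather than failing on the first Neutron error. Roughly, with a hypothetical deallocate() callable standing in for the Neutron call:

    from oslo_service import loopingcall

    def deallocate_with_retries(deallocate, max_attempts=3, interval=1):
        attempts = 0

        def _try_once():
            nonlocal attempts
            attempts += 1
            try:
                deallocate()          # e.g. deallocate_for_instance()
            except Exception:
                if attempts >= max_attempts:
                    raise             # out of retries: propagate the error
                return                # run again after `interval` seconds
            raise loopingcall.LoopingCallDone()   # success: stop looping

        timer = loopingcall.FixedIntervalLoopingCall(_try_once)
        timer.start(interval=interval).wait()    # blocks until done or raise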
[ 1879.605860] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1dd8aa56-44f4-4ed5-b4c1-0de8a55f937b tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "1605028f-5d6d-4ac4-8416-c0465982c53a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.185s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1879.683985] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edc3641a-20d7-4ab6-9c16-e9156cc0a6bb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1879.691678] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abb35562-53d3-4d42-a76a-a5ec705480f6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1879.720908] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5b4287a-545c-41b5-b5eb-f89c2c085bc5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1879.727923] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91477ece-7546-465d-9240-47f5e8201ebf {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1879.740376] env[68617]: DEBUG nova.compute.provider_tree [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1879.748900] env[68617]: DEBUG nova.scheduler.client.report [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1879.761992] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.277s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1879.762462] env[68617]: DEBUG nova.compute.manager [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Start building networks asynchronously for instance. 
{{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1879.795182] env[68617]: DEBUG nova.compute.utils [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1879.796549] env[68617]: DEBUG nova.compute.manager [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1879.796712] env[68617]: DEBUG nova.network.neutron [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1879.804049] env[68617]: DEBUG nova.compute.manager [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1879.860311] env[68617]: DEBUG nova.policy [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0a963cfef0574bcfa5d3dac76dfb77e8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '99696364d8b44286801509a3b1ebee8a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 1879.866459] env[68617]: DEBUG nova.compute.manager [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1879.892691] env[68617]: DEBUG nova.virt.hardware [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1879.892984] env[68617]: DEBUG nova.virt.hardware [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1879.893199] env[68617]: DEBUG nova.virt.hardware [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1879.893424] env[68617]: DEBUG nova.virt.hardware [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1879.893614] env[68617]: DEBUG nova.virt.hardware [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1879.893794] env[68617]: DEBUG nova.virt.hardware [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1879.894037] env[68617]: DEBUG nova.virt.hardware [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1879.894228] env[68617]: DEBUG nova.virt.hardware [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1879.894432] env[68617]: DEBUG 
nova.virt.hardware [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1879.894633] env[68617]: DEBUG nova.virt.hardware [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1879.894839] env[68617]: DEBUG nova.virt.hardware [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1879.896239] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ae1a0f0-f203-4775-9813-6c8fbd5f14f3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1879.904657] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b2deb45-40cb-419b-a2d8-9ef44f2ec1cb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1880.169212] env[68617]: DEBUG nova.network.neutron [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Successfully created port: 6bfc5d99-089a-470c-99c9-39b22a9a8fdc {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1880.894783] env[68617]: DEBUG nova.network.neutron [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Successfully updated port: 6bfc5d99-089a-470c-99c9-39b22a9a8fdc {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1880.909279] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Acquiring lock "refresh_cache-a4ab788d-327a-47cc-8ae7-e1b9be889759" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1880.909679] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Acquired lock "refresh_cache-a4ab788d-327a-47cc-8ae7-e1b9be889759" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1880.909679] env[68617]: DEBUG nova.network.neutron [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1880.948575] env[68617]: DEBUG nova.network.neutron [None 
req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1881.104145] env[68617]: DEBUG nova.network.neutron [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Updating instance_info_cache with network_info: [{"id": "6bfc5d99-089a-470c-99c9-39b22a9a8fdc", "address": "fa:16:3e:c6:95:dc", "network": {"id": "ca1ddb0e-2fab-47c8-8199-b99d8268db3d", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-155280593-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "99696364d8b44286801509a3b1ebee8a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dbd2870d-a51d-472a-8034-1b3e132b5cb6", "external-id": "nsx-vlan-transportzone-101", "segmentation_id": 101, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6bfc5d99-08", "ovs_interfaceid": "6bfc5d99-089a-470c-99c9-39b22a9a8fdc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1881.117023] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Releasing lock "refresh_cache-a4ab788d-327a-47cc-8ae7-e1b9be889759" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1881.117325] env[68617]: DEBUG nova.compute.manager [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Instance network_info: |[{"id": "6bfc5d99-089a-470c-99c9-39b22a9a8fdc", "address": "fa:16:3e:c6:95:dc", "network": {"id": "ca1ddb0e-2fab-47c8-8199-b99d8268db3d", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-155280593-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "99696364d8b44286801509a3b1ebee8a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dbd2870d-a51d-472a-8034-1b3e132b5cb6", "external-id": "nsx-vlan-transportzone-101", "segmentation_id": 101, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6bfc5d99-08", "ovs_interfaceid": 
"6bfc5d99-089a-470c-99c9-39b22a9a8fdc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1881.117761] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c6:95:dc', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'dbd2870d-a51d-472a-8034-1b3e132b5cb6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6bfc5d99-089a-470c-99c9-39b22a9a8fdc', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1881.125204] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Creating folder: Project (99696364d8b44286801509a3b1ebee8a). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1881.125725] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-804ce9c2-4b43-4d00-a908-f39e3fda3082 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1881.135911] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Created folder: Project (99696364d8b44286801509a3b1ebee8a) in parent group-v693691. [ 1881.136101] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Creating folder: Instances. Parent ref: group-v693792. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1881.136310] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2e598bad-ee71-4827-b390-a9044fd3070f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1881.143526] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Created folder: Instances in parent group-v693792. [ 1881.143759] env[68617]: DEBUG oslo.service.loopingcall [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1881.143941] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1881.144138] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-bdeb363d-2a6c-4e88-bd21-7537eebf8a22 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1881.161634] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1881.161634] env[68617]: value = "task-3470878" [ 1881.161634] env[68617]: _type = "Task" [ 1881.161634] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1881.168439] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470878, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1881.376770] env[68617]: DEBUG nova.compute.manager [req-1ecf5c9c-d599-4033-82ab-b098aaf773eb req-dd7bbd8d-cac5-49b0-9f10-7a8a81ace050 service nova] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Received event network-vif-plugged-6bfc5d99-089a-470c-99c9-39b22a9a8fdc {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1881.377022] env[68617]: DEBUG oslo_concurrency.lockutils [req-1ecf5c9c-d599-4033-82ab-b098aaf773eb req-dd7bbd8d-cac5-49b0-9f10-7a8a81ace050 service nova] Acquiring lock "a4ab788d-327a-47cc-8ae7-e1b9be889759-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1881.377236] env[68617]: DEBUG oslo_concurrency.lockutils [req-1ecf5c9c-d599-4033-82ab-b098aaf773eb req-dd7bbd8d-cac5-49b0-9f10-7a8a81ace050 service nova] Lock "a4ab788d-327a-47cc-8ae7-e1b9be889759-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1881.377409] env[68617]: DEBUG oslo_concurrency.lockutils [req-1ecf5c9c-d599-4033-82ab-b098aaf773eb req-dd7bbd8d-cac5-49b0-9f10-7a8a81ace050 service nova] Lock "a4ab788d-327a-47cc-8ae7-e1b9be889759-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1881.377612] env[68617]: DEBUG nova.compute.manager [req-1ecf5c9c-d599-4033-82ab-b098aaf773eb req-dd7bbd8d-cac5-49b0-9f10-7a8a81ace050 service nova] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] No waiting events found dispatching network-vif-plugged-6bfc5d99-089a-470c-99c9-39b22a9a8fdc {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1881.377789] env[68617]: WARNING nova.compute.manager [req-1ecf5c9c-d599-4033-82ab-b098aaf773eb req-dd7bbd8d-cac5-49b0-9f10-7a8a81ace050 service nova] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Received unexpected event network-vif-plugged-6bfc5d99-089a-470c-99c9-39b22a9a8fdc for instance with vm_state building and task_state spawning. 
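The network-vif-plugged records above show Nova's external-event handshake: Neutron reports the port active, and the compute manager tries to pop a matching waiter under the instance's "-events" lock; since the spawning thread had not registered for this event ("No waiting events found"), it is logged as unexpected and dropped. A toy pop-or-drop dispatcher, with plain threading primitives standing in for the real InstanceEvents machinery:

    import threading

    class InstanceEvents:
        def __init__(self):
            self._lock = threading.Lock()   # plays the "<uuid>-events" lock role
            self._waiters = {}              # (instance_uuid, event_name) -> Event

        def prepare_for_event(self, instance_uuid, name):
            # A spawning thread calls this *before* triggering the action
            # that will generate the event, then waits on the Event.
            ev = threading.Event()
            with self._lock:
                self._waiters[(instance_uuid, name)] = ev
            return ev

        def pop_event(self, instance_uuid, name):
            # Called when Neutron delivers the external event.
            with self._lock:
                return self._waiters.pop((instance_uuid, name), None)

    events = InstanceEvents()
    waiter = events.pop_event('a4ab788d', 'network-vif-plugged-6bfc5d99')
    if waiter is None:
        print('No waiting events found; dropping unexpected event')
    else:
        waiter.set()   # wake the thread blocked in prepare_for_event().wait()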
[ 1881.377953] env[68617]: DEBUG nova.compute.manager [req-1ecf5c9c-d599-4033-82ab-b098aaf773eb req-dd7bbd8d-cac5-49b0-9f10-7a8a81ace050 service nova] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Received event network-changed-6bfc5d99-089a-470c-99c9-39b22a9a8fdc {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1881.378124] env[68617]: DEBUG nova.compute.manager [req-1ecf5c9c-d599-4033-82ab-b098aaf773eb req-dd7bbd8d-cac5-49b0-9f10-7a8a81ace050 service nova] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Refreshing instance network info cache due to event network-changed-6bfc5d99-089a-470c-99c9-39b22a9a8fdc. {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1881.378311] env[68617]: DEBUG oslo_concurrency.lockutils [req-1ecf5c9c-d599-4033-82ab-b098aaf773eb req-dd7bbd8d-cac5-49b0-9f10-7a8a81ace050 service nova] Acquiring lock "refresh_cache-a4ab788d-327a-47cc-8ae7-e1b9be889759" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1881.378449] env[68617]: DEBUG oslo_concurrency.lockutils [req-1ecf5c9c-d599-4033-82ab-b098aaf773eb req-dd7bbd8d-cac5-49b0-9f10-7a8a81ace050 service nova] Acquired lock "refresh_cache-a4ab788d-327a-47cc-8ae7-e1b9be889759" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1881.378614] env[68617]: DEBUG nova.network.neutron [req-1ecf5c9c-d599-4033-82ab-b098aaf773eb req-dd7bbd8d-cac5-49b0-9f10-7a8a81ace050 service nova] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Refreshing network info cache for port 6bfc5d99-089a-470c-99c9-39b22a9a8fdc {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1881.624804] env[68617]: DEBUG nova.network.neutron [req-1ecf5c9c-d599-4033-82ab-b098aaf773eb req-dd7bbd8d-cac5-49b0-9f10-7a8a81ace050 service nova] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Updated VIF entry in instance network info cache for port 6bfc5d99-089a-470c-99c9-39b22a9a8fdc. 
{{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1881.625347] env[68617]: DEBUG nova.network.neutron [req-1ecf5c9c-d599-4033-82ab-b098aaf773eb req-dd7bbd8d-cac5-49b0-9f10-7a8a81ace050 service nova] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Updating instance_info_cache with network_info: [{"id": "6bfc5d99-089a-470c-99c9-39b22a9a8fdc", "address": "fa:16:3e:c6:95:dc", "network": {"id": "ca1ddb0e-2fab-47c8-8199-b99d8268db3d", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-155280593-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "99696364d8b44286801509a3b1ebee8a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dbd2870d-a51d-472a-8034-1b3e132b5cb6", "external-id": "nsx-vlan-transportzone-101", "segmentation_id": 101, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6bfc5d99-08", "ovs_interfaceid": "6bfc5d99-089a-470c-99c9-39b22a9a8fdc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1881.634961] env[68617]: DEBUG oslo_concurrency.lockutils [req-1ecf5c9c-d599-4033-82ab-b098aaf773eb req-dd7bbd8d-cac5-49b0-9f10-7a8a81ace050 service nova] Releasing lock "refresh_cache-a4ab788d-327a-47cc-8ae7-e1b9be889759" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1881.671656] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470878, 'name': CreateVM_Task, 'duration_secs': 0.280463} completed successfully. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1881.671799] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1881.672420] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1881.672590] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1881.672993] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1881.673511] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9fdc04f4-2852-4e49-ae22-429bef8c175f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1881.677878] env[68617]: DEBUG oslo_vmware.api [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Waiting for the task: (returnval){ [ 1881.677878] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]524cf72d-5729-8631-ada9-2d5f3b5befbe" [ 1881.677878] env[68617]: _type = "Task" [ 1881.677878] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1881.685115] env[68617]: DEBUG oslo_vmware.api [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]524cf72d-5729-8631-ada9-2d5f3b5befbe, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1882.188994] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1882.189347] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1882.189453] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1891.821806] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ea2039ab-4bbd-4b5a-af4d-095d8a0fa125 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "b1a8dc60-af98-4f80-96cf-b2550ea8c13a" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1894.084474] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1bcc9db0-15b4-4fc7-ac3d-d65126e3f188 tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Acquiring lock "a4ab788d-327a-47cc-8ae7-e1b9be889759" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1903.607050] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._sync_power_states {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1903.628805] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Getting list of instances from cluster (obj){ [ 1903.628805] env[68617]: value = "domain-c8" [ 1903.628805] env[68617]: _type = "ClusterComputeResource" [ 1903.628805] env[68617]: } {{(pid=68617) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1903.630174] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f899f268-670d-4729-bf2d-f997107bf247 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.646890] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Got total of 10 instances {{(pid=68617) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1903.647072] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Triggering 
sync for uuid fc1043b8-535d-4af0-b92b-1f43580cdc9a {{(pid=68617) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1903.647272] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Triggering sync for uuid 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4 {{(pid=68617) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1903.647435] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Triggering sync for uuid f54002b0-d60e-44ff-82a5-ef2f5193c48c {{(pid=68617) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1903.647590] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Triggering sync for uuid 2c950cba-7698-48e0-8852-bf569f58f967 {{(pid=68617) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1903.647766] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Triggering sync for uuid 12ed2a40-3d74-49a2-95b4-ccaaf58c8060 {{(pid=68617) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1903.647946] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Triggering sync for uuid 21d0560a-fde3-4c16-b2fc-06d6f8668a7a {{(pid=68617) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1903.648124] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Triggering sync for uuid 902b5ab9-23b8-450f-853a-b2da889c3afd {{(pid=68617) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1903.648276] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Triggering sync for uuid 922c8926-c636-4463-85d6-4f2a6325b85a {{(pid=68617) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1903.648426] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Triggering sync for uuid b1a8dc60-af98-4f80-96cf-b2550ea8c13a {{(pid=68617) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1903.648577] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Triggering sync for uuid a4ab788d-327a-47cc-8ae7-e1b9be889759 {{(pid=68617) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1903.648892] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "fc1043b8-535d-4af0-b92b-1f43580cdc9a" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1903.649162] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1903.649365] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "f54002b0-d60e-44ff-82a5-ef2f5193c48c" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68617) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1903.649560] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "2c950cba-7698-48e0-8852-bf569f58f967" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1903.649750] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "12ed2a40-3d74-49a2-95b4-ccaaf58c8060" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1903.649945] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "21d0560a-fde3-4c16-b2fc-06d6f8668a7a" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1903.650181] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "902b5ab9-23b8-450f-853a-b2da889c3afd" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1903.650397] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "922c8926-c636-4463-85d6-4f2a6325b85a" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1903.650596] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "b1a8dc60-af98-4f80-96cf-b2550ea8c13a" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1903.650787] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "a4ab788d-327a-47cc-8ae7-e1b9be889759" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1914.743772] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1914.744186] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1914.744439] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1914.766046] env[68617]: DEBUG 
nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1914.766218] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1914.766350] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1914.766477] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1914.766613] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1914.766736] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1914.766856] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1914.766974] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1914.767103] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1914.767218] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1914.767336] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1918.700053] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1918.700053] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1918.700425] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1918.700425] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1919.256108] env[68617]: DEBUG oslo_concurrency.lockutils [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Acquiring lock "17bb8415-dafd-47ed-9a14-52163ba5e7db" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1919.256281] env[68617]: DEBUG oslo_concurrency.lockutils [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Lock "17bb8415-dafd-47ed-9a14-52163ba5e7db" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1919.700114] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1921.700028] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1922.699143] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1922.711060] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1922.711341] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" 
acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1922.711500] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1922.711695] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1922.712730] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa598d74-2e0a-4bcf-9bb2-2ab6a422e07f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1922.722683] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43adc193-f92a-49a5-97f7-9916c884de35 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1922.736143] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79c61766-61f9-4134-957f-c909ff1bae4b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1922.742175] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46eec341-d089-40e8-bceb-d76902760ede {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1922.772228] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180875MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1922.772397] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1922.772624] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1922.845825] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance fc1043b8-535d-4af0-b92b-1f43580cdc9a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.845987] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.846134] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f54002b0-d60e-44ff-82a5-ef2f5193c48c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.846260] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 2c950cba-7698-48e0-8852-bf569f58f967 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.846378] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 12ed2a40-3d74-49a2-95b4-ccaaf58c8060 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.846495] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 21d0560a-fde3-4c16-b2fc-06d6f8668a7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.846669] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 902b5ab9-23b8-450f-853a-b2da889c3afd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.846820] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 922c8926-c636-4463-85d6-4f2a6325b85a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.846939] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b1a8dc60-af98-4f80-96cf-b2550ea8c13a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.847069] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a4ab788d-327a-47cc-8ae7-e1b9be889759 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.859046] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 17bb8415-dafd-47ed-9a14-52163ba5e7db has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1922.859046] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1922.859046] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1922.994436] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4046c92f-4c0a-47d8-a5a9-8416f1dbf3b1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1923.001638] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-376f6810-a8f1-4d93-b0b5-59ec6ba81cb3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1923.031882] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f153b08-2bc6-4d4e-b2fb-c4ff6b629643 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1923.038705] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da75eb66-6268-410e-b162-bb8799cf63a0 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1923.051182] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1923.078049] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 
'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1923.092261] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1923.092510] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.320s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1924.169652] env[68617]: WARNING oslo_vmware.rw_handles [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1924.169652] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1924.169652] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1924.169652] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1924.169652] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1924.169652] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 1924.169652] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1924.169652] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1924.169652] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1924.169652] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1924.169652] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1924.169652] env[68617]: ERROR oslo_vmware.rw_handles [ 1924.169652] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/3bc78724-87b5-4bc6-a5cd-4ab917212c2b/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1924.171900] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1924.172172] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Copying Virtual Disk [datastore2] 
vmware_temp/3bc78724-87b5-4bc6-a5cd-4ab917212c2b/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/3bc78724-87b5-4bc6-a5cd-4ab917212c2b/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1924.172452] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-7c483176-ff92-4ccd-b634-070326027e96 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1924.180380] env[68617]: DEBUG oslo_vmware.api [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Waiting for the task: (returnval){ [ 1924.180380] env[68617]: value = "task-3470879" [ 1924.180380] env[68617]: _type = "Task" [ 1924.180380] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1924.188340] env[68617]: DEBUG oslo_vmware.api [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Task: {'id': task-3470879, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1924.691376] env[68617]: DEBUG oslo_vmware.exceptions [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Fault InvalidArgument not matched. {{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1924.691667] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1924.692272] env[68617]: ERROR nova.compute.manager [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1924.692272] env[68617]: Faults: ['InvalidArgument'] [ 1924.692272] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Traceback (most recent call last): [ 1924.692272] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1924.692272] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] yield resources [ 1924.692272] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1924.692272] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] self.driver.spawn(context, instance, image_meta, [ 1924.692272] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] 
File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1924.692272] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1924.692272] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1924.692272] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] self._fetch_image_if_missing(context, vi) [ 1924.692272] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1924.692666] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] image_cache(vi, tmp_image_ds_loc) [ 1924.692666] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1924.692666] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] vm_util.copy_virtual_disk( [ 1924.692666] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1924.692666] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] session._wait_for_task(vmdk_copy_task) [ 1924.692666] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1924.692666] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] return self.wait_for_task(task_ref) [ 1924.692666] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1924.692666] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] return evt.wait() [ 1924.692666] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1924.692666] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] result = hub.switch() [ 1924.692666] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1924.692666] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] return self.greenlet.switch() [ 1924.693083] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1924.693083] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] self.f(*self.args, **self.kw) [ 1924.693083] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1924.693083] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] raise 
exceptions.translate_fault(task_info.error) [ 1924.693083] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1924.693083] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Faults: ['InvalidArgument'] [ 1924.693083] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] [ 1924.693083] env[68617]: INFO nova.compute.manager [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Terminating instance [ 1924.694389] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1924.694389] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1924.694931] env[68617]: DEBUG nova.compute.manager [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Start destroying the instance on the hypervisor. 
{{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1924.695135] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1924.695362] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-01cbebbe-433e-4fcf-a780-304eaaa9a0b3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1924.697624] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15b609c8-79cd-4878-841e-75c752ab38fc {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1924.704204] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1924.704423] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-160df010-5954-48bf-b088-ff8bb48b333a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1924.706546] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1924.706716] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1924.707634] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fbd261c2-dec9-42d0-85c4-2c40ecb811ff {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1924.711998] env[68617]: DEBUG oslo_vmware.api [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Waiting for the task: (returnval){ [ 1924.711998] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52ddf9a8-9271-0ab8-60fc-eddf4594ebc5" [ 1924.711998] env[68617]: _type = "Task" [ 1924.711998] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1924.719303] env[68617]: DEBUG oslo_vmware.api [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52ddf9a8-9271-0ab8-60fc-eddf4594ebc5, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1924.772420] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1924.772642] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1924.772820] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Deleting the datastore file [datastore2] fc1043b8-535d-4af0-b92b-1f43580cdc9a {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1924.773109] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6cf979c3-e270-4a2d-a1e4-8005a8ab0b14 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1924.779291] env[68617]: DEBUG oslo_vmware.api [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Waiting for the task: (returnval){ [ 1924.779291] env[68617]: value = "task-3470881" [ 1924.779291] env[68617]: _type = "Task" [ 1924.779291] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1924.787053] env[68617]: DEBUG oslo_vmware.api [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Task: {'id': task-3470881, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1925.093035] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1925.222267] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1925.222643] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Creating directory with path [datastore2] vmware_temp/ceb56d4b-6551-405f-b54b-f3a2644572d3/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1925.222733] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c63f12cb-b6dc-40bc-a1e7-34f1411a9f71 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1925.233990] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Created directory with path [datastore2] vmware_temp/ceb56d4b-6551-405f-b54b-f3a2644572d3/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1925.234204] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Fetch image to [datastore2] vmware_temp/ceb56d4b-6551-405f-b54b-f3a2644572d3/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1925.234374] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/ceb56d4b-6551-405f-b54b-f3a2644572d3/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1925.235096] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be9a312e-fd1a-459e-9f49-855fe3a6aad5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1925.242643] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79217b93-ea3c-4a9e-b100-fc1198b17f5f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1925.251285] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-2e0e33dc-4df5-4dcd-a0f1-787243a30778 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1925.284151] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd655481-07f6-4e63-865a-c7664ff8d4c2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1925.291313] env[68617]: DEBUG oslo_vmware.api [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Task: {'id': task-3470881, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079937} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1925.292859] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1925.293069] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1925.293249] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1925.293425] env[68617]: INFO nova.compute.manager [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1925.295243] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-58ddd7e4-b455-4d59-8596-8eefb4cce5ed {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1925.297146] env[68617]: DEBUG nova.compute.claims [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1925.297353] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1925.297570] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1925.322925] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1925.456869] env[68617]: DEBUG oslo_vmware.rw_handles [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ceb56d4b-6551-405f-b54b-f3a2644572d3/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1925.516397] env[68617]: DEBUG oslo_vmware.rw_handles [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1925.516625] env[68617]: DEBUG oslo_vmware.rw_handles [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ceb56d4b-6551-405f-b54b-f3a2644572d3/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1925.528150] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a409bfcc-9776-44d0-a58a-5d3666673005 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1925.536191] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6198a2ee-3c66-4fe0-9dd5-70e1c9a7930d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1925.566565] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23dbefb9-68bd-43ab-b795-aadcd30e122b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1925.573703] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e840959-db16-4871-a863-4d58e5209acb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1925.586541] env[68617]: DEBUG nova.compute.provider_tree [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1925.595434] env[68617]: DEBUG nova.scheduler.client.report [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1925.609068] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.311s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1925.609606] env[68617]: ERROR nova.compute.manager [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1925.609606] env[68617]: Faults: ['InvalidArgument'] [ 1925.609606] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Traceback (most recent call last): [ 1925.609606] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1925.609606] 
env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] self.driver.spawn(context, instance, image_meta, [ 1925.609606] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1925.609606] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1925.609606] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1925.609606] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] self._fetch_image_if_missing(context, vi) [ 1925.609606] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1925.609606] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] image_cache(vi, tmp_image_ds_loc) [ 1925.609606] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1925.609960] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] vm_util.copy_virtual_disk( [ 1925.609960] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1925.609960] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] session._wait_for_task(vmdk_copy_task) [ 1925.609960] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1925.609960] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] return self.wait_for_task(task_ref) [ 1925.609960] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1925.609960] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] return evt.wait() [ 1925.609960] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1925.609960] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] result = hub.switch() [ 1925.609960] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1925.609960] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] return self.greenlet.switch() [ 1925.609960] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1925.609960] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] self.f(*self.args, **self.kw) [ 1925.610645] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1925.610645] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] raise exceptions.translate_fault(task_info.error) [ 1925.610645] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1925.610645] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Faults: ['InvalidArgument'] [ 1925.610645] env[68617]: ERROR nova.compute.manager [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] [ 1925.610645] env[68617]: DEBUG nova.compute.utils [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1925.611795] env[68617]: DEBUG nova.compute.manager [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Build of instance fc1043b8-535d-4af0-b92b-1f43580cdc9a was re-scheduled: A specified parameter was not correct: fileType [ 1925.611795] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1925.613024] env[68617]: DEBUG nova.compute.manager [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1925.613024] env[68617]: DEBUG nova.compute.manager [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1925.613024] env[68617]: DEBUG nova.compute.manager [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1925.613024] env[68617]: DEBUG nova.network.neutron [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1925.694341] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1925.941867] env[68617]: DEBUG nova.network.neutron [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1925.951632] env[68617]: INFO nova.compute.manager [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Took 0.34 seconds to deallocate network for instance. 
[ 1926.053224] env[68617]: INFO nova.scheduler.client.report [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Deleted allocations for instance fc1043b8-535d-4af0-b92b-1f43580cdc9a [ 1926.073508] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2ef5f8f1-c872-4672-9911-d8b6c8a2ea17 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Lock "fc1043b8-535d-4af0-b92b-1f43580cdc9a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 600.228s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1926.074650] env[68617]: DEBUG oslo_concurrency.lockutils [None req-40dee1cc-64c4-43cd-b779-cde0e29e04da tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Lock "fc1043b8-535d-4af0-b92b-1f43580cdc9a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 403.284s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1926.074868] env[68617]: DEBUG oslo_concurrency.lockutils [None req-40dee1cc-64c4-43cd-b779-cde0e29e04da tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Acquiring lock "fc1043b8-535d-4af0-b92b-1f43580cdc9a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1926.075081] env[68617]: DEBUG oslo_concurrency.lockutils [None req-40dee1cc-64c4-43cd-b779-cde0e29e04da tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Lock "fc1043b8-535d-4af0-b92b-1f43580cdc9a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1926.075254] env[68617]: DEBUG oslo_concurrency.lockutils [None req-40dee1cc-64c4-43cd-b779-cde0e29e04da tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Lock "fc1043b8-535d-4af0-b92b-1f43580cdc9a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1926.077503] env[68617]: INFO nova.compute.manager [None req-40dee1cc-64c4-43cd-b779-cde0e29e04da tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Terminating instance [ 1926.079236] env[68617]: DEBUG nova.compute.manager [None req-40dee1cc-64c4-43cd-b779-cde0e29e04da tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Start destroying the instance on the hypervisor. 
{{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1926.079437] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-40dee1cc-64c4-43cd-b779-cde0e29e04da tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1926.080118] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b8c5daf0-0ef1-44ea-8e7d-2bf04325baa7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1926.089882] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cc7d0f9-0fc8-4753-9317-6f029045124b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1926.103082] env[68617]: DEBUG nova.compute.manager [None req-2b706b28-1b8c-4103-9ebc-58c321e14b9f tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fe0d64a6-6ce6-4ef5-8ae1-a160c5ec0987] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1926.119254] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-40dee1cc-64c4-43cd-b779-cde0e29e04da tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance fc1043b8-535d-4af0-b92b-1f43580cdc9a could not be found. [ 1926.119479] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-40dee1cc-64c4-43cd-b779-cde0e29e04da tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1926.119662] env[68617]: INFO nova.compute.manager [None req-40dee1cc-64c4-43cd-b779-cde0e29e04da tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1926.119900] env[68617]: DEBUG oslo.service.loopingcall [None req-40dee1cc-64c4-43cd-b779-cde0e29e04da tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1926.120157] env[68617]: DEBUG nova.compute.manager [-] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1926.120252] env[68617]: DEBUG nova.network.neutron [-] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1926.128879] env[68617]: DEBUG nova.compute.manager [None req-2b706b28-1b8c-4103-9ebc-58c321e14b9f tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: fe0d64a6-6ce6-4ef5-8ae1-a160c5ec0987] Instance disappeared before build. 
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1926.143583] env[68617]: DEBUG nova.network.neutron [-] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1926.155286] env[68617]: INFO nova.compute.manager [-] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] Took 0.03 seconds to deallocate network for instance. [ 1926.163234] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2b706b28-1b8c-4103-9ebc-58c321e14b9f tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Lock "fe0d64a6-6ce6-4ef5-8ae1-a160c5ec0987" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 204.576s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1926.171530] env[68617]: DEBUG nova.compute.manager [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1926.221693] env[68617]: DEBUG oslo_concurrency.lockutils [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1926.221942] env[68617]: DEBUG oslo_concurrency.lockutils [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1926.223558] env[68617]: INFO nova.compute.claims [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1926.247285] env[68617]: DEBUG oslo_concurrency.lockutils [None req-40dee1cc-64c4-43cd-b779-cde0e29e04da tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Lock "fc1043b8-535d-4af0-b92b-1f43580cdc9a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.173s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1926.248097] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "fc1043b8-535d-4af0-b92b-1f43580cdc9a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 22.599s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1926.248287] env[68617]: INFO nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: fc1043b8-535d-4af0-b92b-1f43580cdc9a] During 
sync_power_state the instance has a pending task (deleting). Skip. [ 1926.248457] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "fc1043b8-535d-4af0-b92b-1f43580cdc9a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1926.392086] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae6aba28-2227-4dbf-93e3-b496ac5b45fd {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1926.399597] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1439b65d-4f8c-4070-b048-3e925dc261a0 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1926.430519] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed06570e-9499-4d9b-8bb8-061c9186c4db {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1926.438152] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8375870-2847-4440-8f16-f2a8702aad28 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1926.454339] env[68617]: DEBUG nova.compute.provider_tree [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1926.462812] env[68617]: DEBUG nova.scheduler.client.report [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1926.478681] env[68617]: DEBUG oslo_concurrency.lockutils [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.256s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1926.479009] env[68617]: DEBUG nova.compute.manager [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Start building networks asynchronously for instance. 
{{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1926.524800] env[68617]: DEBUG nova.compute.utils [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1926.526848] env[68617]: DEBUG nova.compute.manager [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1926.527013] env[68617]: DEBUG nova.network.neutron [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1926.537779] env[68617]: DEBUG nova.compute.manager [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1926.589569] env[68617]: DEBUG nova.policy [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '224a4101e01748579f093e7116ca2a1a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b191b6855afd48fb9335661e492e3d39', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 1926.619865] env[68617]: DEBUG nova.compute.manager [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1926.644846] env[68617]: DEBUG nova.virt.hardware [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1926.645118] env[68617]: DEBUG nova.virt.hardware [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1926.645278] env[68617]: DEBUG nova.virt.hardware [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1926.645673] env[68617]: DEBUG nova.virt.hardware [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1926.645673] env[68617]: DEBUG nova.virt.hardware [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1926.645808] env[68617]: DEBUG nova.virt.hardware [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1926.645953] env[68617]: DEBUG nova.virt.hardware [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1926.646123] env[68617]: DEBUG nova.virt.hardware [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1926.646291] env[68617]: DEBUG 
nova.virt.hardware [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1926.646450] env[68617]: DEBUG nova.virt.hardware [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1926.646621] env[68617]: DEBUG nova.virt.hardware [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1926.647537] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9dd8622-c306-4b6f-93ad-4ebb5a6c8a03 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1926.656150] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-307bf180-19ff-48c2-a0e6-e2722dfc549c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1926.914061] env[68617]: DEBUG nova.network.neutron [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Successfully created port: a965ae24-889e-432b-948a-a5a34007f554 {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1927.490767] env[68617]: DEBUG nova.network.neutron [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Successfully updated port: a965ae24-889e-432b-948a-a5a34007f554 {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1927.505817] env[68617]: DEBUG oslo_concurrency.lockutils [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Acquiring lock "refresh_cache-17bb8415-dafd-47ed-9a14-52163ba5e7db" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1927.505972] env[68617]: DEBUG oslo_concurrency.lockutils [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Acquired lock "refresh_cache-17bb8415-dafd-47ed-9a14-52163ba5e7db" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1927.506153] env[68617]: DEBUG nova.network.neutron [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1927.552803] env[68617]: DEBUG nova.network.neutron [None 
req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1927.727542] env[68617]: DEBUG nova.network.neutron [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Updating instance_info_cache with network_info: [{"id": "a965ae24-889e-432b-948a-a5a34007f554", "address": "fa:16:3e:80:96:ff", "network": {"id": "6d8ddf36-28a8-4ec5-8fb8-d3577062a14c", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-512079755-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b191b6855afd48fb9335661e492e3d39", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ce17e10e-2fb0-4191-afee-e2b89fa15074", "external-id": "nsx-vlan-transportzone-352", "segmentation_id": 352, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa965ae24-88", "ovs_interfaceid": "a965ae24-889e-432b-948a-a5a34007f554", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1927.741588] env[68617]: DEBUG oslo_concurrency.lockutils [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Releasing lock "refresh_cache-17bb8415-dafd-47ed-9a14-52163ba5e7db" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1927.741893] env[68617]: DEBUG nova.compute.manager [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Instance network_info: |[{"id": "a965ae24-889e-432b-948a-a5a34007f554", "address": "fa:16:3e:80:96:ff", "network": {"id": "6d8ddf36-28a8-4ec5-8fb8-d3577062a14c", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-512079755-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b191b6855afd48fb9335661e492e3d39", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ce17e10e-2fb0-4191-afee-e2b89fa15074", "external-id": "nsx-vlan-transportzone-352", "segmentation_id": 352, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa965ae24-88", "ovs_interfaceid": 
"a965ae24-889e-432b-948a-a5a34007f554", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1927.742452] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:80:96:ff', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ce17e10e-2fb0-4191-afee-e2b89fa15074', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a965ae24-889e-432b-948a-a5a34007f554', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1927.750349] env[68617]: DEBUG oslo.service.loopingcall [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1927.750841] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1927.751088] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a14612cd-6052-4d2e-9e27-3a0d8f149da9 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1927.771626] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1927.771626] env[68617]: value = "task-3470882" [ 1927.771626] env[68617]: _type = "Task" [ 1927.771626] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1927.779539] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470882, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1928.007061] env[68617]: DEBUG nova.compute.manager [req-bf10508d-f216-4d43-8ac4-ecdfe3e66cbc req-e47255df-3e89-4eee-8d01-cfb1e4564e2b service nova] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Received event network-vif-plugged-a965ae24-889e-432b-948a-a5a34007f554 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1928.007380] env[68617]: DEBUG oslo_concurrency.lockutils [req-bf10508d-f216-4d43-8ac4-ecdfe3e66cbc req-e47255df-3e89-4eee-8d01-cfb1e4564e2b service nova] Acquiring lock "17bb8415-dafd-47ed-9a14-52163ba5e7db-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1928.007600] env[68617]: DEBUG oslo_concurrency.lockutils [req-bf10508d-f216-4d43-8ac4-ecdfe3e66cbc req-e47255df-3e89-4eee-8d01-cfb1e4564e2b service nova] Lock "17bb8415-dafd-47ed-9a14-52163ba5e7db-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1928.007773] env[68617]: DEBUG oslo_concurrency.lockutils [req-bf10508d-f216-4d43-8ac4-ecdfe3e66cbc req-e47255df-3e89-4eee-8d01-cfb1e4564e2b service nova] Lock "17bb8415-dafd-47ed-9a14-52163ba5e7db-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1928.007969] env[68617]: DEBUG nova.compute.manager [req-bf10508d-f216-4d43-8ac4-ecdfe3e66cbc req-e47255df-3e89-4eee-8d01-cfb1e4564e2b service nova] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] No waiting events found dispatching network-vif-plugged-a965ae24-889e-432b-948a-a5a34007f554 {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1928.008187] env[68617]: WARNING nova.compute.manager [req-bf10508d-f216-4d43-8ac4-ecdfe3e66cbc req-e47255df-3e89-4eee-8d01-cfb1e4564e2b service nova] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Received unexpected event network-vif-plugged-a965ae24-889e-432b-948a-a5a34007f554 for instance with vm_state building and task_state spawning. [ 1928.008413] env[68617]: DEBUG nova.compute.manager [req-bf10508d-f216-4d43-8ac4-ecdfe3e66cbc req-e47255df-3e89-4eee-8d01-cfb1e4564e2b service nova] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Received event network-changed-a965ae24-889e-432b-948a-a5a34007f554 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1928.008537] env[68617]: DEBUG nova.compute.manager [req-bf10508d-f216-4d43-8ac4-ecdfe3e66cbc req-e47255df-3e89-4eee-8d01-cfb1e4564e2b service nova] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Refreshing instance network info cache due to event network-changed-a965ae24-889e-432b-948a-a5a34007f554. 
{{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1928.008729] env[68617]: DEBUG oslo_concurrency.lockutils [req-bf10508d-f216-4d43-8ac4-ecdfe3e66cbc req-e47255df-3e89-4eee-8d01-cfb1e4564e2b service nova] Acquiring lock "refresh_cache-17bb8415-dafd-47ed-9a14-52163ba5e7db" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1928.008879] env[68617]: DEBUG oslo_concurrency.lockutils [req-bf10508d-f216-4d43-8ac4-ecdfe3e66cbc req-e47255df-3e89-4eee-8d01-cfb1e4564e2b service nova] Acquired lock "refresh_cache-17bb8415-dafd-47ed-9a14-52163ba5e7db" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1928.009062] env[68617]: DEBUG nova.network.neutron [req-bf10508d-f216-4d43-8ac4-ecdfe3e66cbc req-e47255df-3e89-4eee-8d01-cfb1e4564e2b service nova] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Refreshing network info cache for port a965ae24-889e-432b-948a-a5a34007f554 {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1928.283072] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470882, 'name': CreateVM_Task, 'duration_secs': 0.291073} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1928.283240] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1928.283874] env[68617]: DEBUG oslo_concurrency.lockutils [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1928.284035] env[68617]: DEBUG oslo_concurrency.lockutils [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1928.284357] env[68617]: DEBUG oslo_concurrency.lockutils [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1928.284595] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c75b8293-821d-4111-a894-ee03dac225a6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.289247] env[68617]: DEBUG oslo_vmware.api [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Waiting for the task: (returnval){ [ 1928.289247] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52c51f61-ba52-a21a-0324-3aa6f852c295" [ 1928.289247] env[68617]: _type = "Task" [ 1928.289247] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1928.297828] env[68617]: DEBUG oslo_vmware.api [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52c51f61-ba52-a21a-0324-3aa6f852c295, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1928.408806] env[68617]: DEBUG nova.network.neutron [req-bf10508d-f216-4d43-8ac4-ecdfe3e66cbc req-e47255df-3e89-4eee-8d01-cfb1e4564e2b service nova] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Updated VIF entry in instance network info cache for port a965ae24-889e-432b-948a-a5a34007f554. {{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1928.409216] env[68617]: DEBUG nova.network.neutron [req-bf10508d-f216-4d43-8ac4-ecdfe3e66cbc req-e47255df-3e89-4eee-8d01-cfb1e4564e2b service nova] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Updating instance_info_cache with network_info: [{"id": "a965ae24-889e-432b-948a-a5a34007f554", "address": "fa:16:3e:80:96:ff", "network": {"id": "6d8ddf36-28a8-4ec5-8fb8-d3577062a14c", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-512079755-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b191b6855afd48fb9335661e492e3d39", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ce17e10e-2fb0-4191-afee-e2b89fa15074", "external-id": "nsx-vlan-transportzone-352", "segmentation_id": 352, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa965ae24-88", "ovs_interfaceid": "a965ae24-889e-432b-948a-a5a34007f554", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1928.418405] env[68617]: DEBUG oslo_concurrency.lockutils [req-bf10508d-f216-4d43-8ac4-ecdfe3e66cbc req-e47255df-3e89-4eee-8d01-cfb1e4564e2b service nova] Releasing lock "refresh_cache-17bb8415-dafd-47ed-9a14-52163ba5e7db" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1928.800217] env[68617]: DEBUG oslo_concurrency.lockutils [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1928.800539] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1928.800683] env[68617]: DEBUG oslo_concurrency.lockutils [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1931.767785] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "82f72313-f493-4acd-a95e-765feb74a358" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1931.768136] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "82f72313-f493-4acd-a95e-765feb74a358" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1942.102659] env[68617]: DEBUG oslo_concurrency.lockutils [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquiring lock "797b434e-a913-43dc-a1df-39fe82da1221" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1942.102965] env[68617]: DEBUG oslo_concurrency.lockutils [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Lock "797b434e-a913-43dc-a1df-39fe82da1221" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1974.186227] env[68617]: WARNING oslo_vmware.rw_handles [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1974.186227] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1974.186227] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1974.186227] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1974.186227] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1974.186227] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 1974.186227] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1974.186227] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1974.186227] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, 
in _read_status [ 1974.186227] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1974.186227] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1974.186227] env[68617]: ERROR oslo_vmware.rw_handles [ 1974.187072] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/ceb56d4b-6551-405f-b54b-f3a2644572d3/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1974.189757] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1974.190028] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Copying Virtual Disk [datastore2] vmware_temp/ceb56d4b-6551-405f-b54b-f3a2644572d3/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/ceb56d4b-6551-405f-b54b-f3a2644572d3/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1974.190387] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-0082c6ed-f47e-4d9a-9e77-492f2db48417 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1974.199150] env[68617]: DEBUG oslo_vmware.api [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Waiting for the task: (returnval){ [ 1974.199150] env[68617]: value = "task-3470883" [ 1974.199150] env[68617]: _type = "Task" [ 1974.199150] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1974.207541] env[68617]: DEBUG oslo_vmware.api [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Task: {'id': task-3470883, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1974.710043] env[68617]: DEBUG oslo_vmware.exceptions [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Fault InvalidArgument not matched. 
{{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1974.710363] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1974.710895] env[68617]: ERROR nova.compute.manager [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1974.710895] env[68617]: Faults: ['InvalidArgument'] [ 1974.710895] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Traceback (most recent call last): [ 1974.710895] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1974.710895] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] yield resources [ 1974.710895] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1974.710895] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] self.driver.spawn(context, instance, image_meta, [ 1974.710895] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1974.710895] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1974.710895] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1974.710895] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] self._fetch_image_if_missing(context, vi) [ 1974.710895] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1974.711818] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] image_cache(vi, tmp_image_ds_loc) [ 1974.711818] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1974.711818] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] vm_util.copy_virtual_disk( [ 1974.711818] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1974.711818] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] session._wait_for_task(vmdk_copy_task) [ 1974.711818] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1974.711818] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] return self.wait_for_task(task_ref) [ 1974.711818] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1974.711818] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] return evt.wait() [ 1974.711818] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1974.711818] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] result = hub.switch() [ 1974.711818] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1974.711818] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] return self.greenlet.switch() [ 1974.712489] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1974.712489] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] self.f(*self.args, **self.kw) [ 1974.712489] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1974.712489] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] raise exceptions.translate_fault(task_info.error) [ 1974.712489] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1974.712489] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Faults: ['InvalidArgument'] [ 1974.712489] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] [ 1974.712489] env[68617]: INFO nova.compute.manager [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Terminating instance [ 1974.712904] env[68617]: DEBUG oslo_concurrency.lockutils [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1974.713083] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1974.713305] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5615291f-e41b-4bd6-a701-b0d3cca863f0 
{{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1974.715867] env[68617]: DEBUG nova.compute.manager [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1974.716076] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1974.716858] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-231c2140-e799-4f1d-9ef3-c1945f6e699e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1974.723903] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1974.724170] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-8603899a-f987-40ce-af93-017b013ae2a5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1974.726590] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1974.726768] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1974.727796] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-dce1f107-c45d-42c7-ad68-3cb60a8946f3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1974.732817] env[68617]: DEBUG oslo_vmware.api [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Waiting for the task: (returnval){ [ 1974.732817] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52e561a6-da7c-7d28-96ff-af9896c1cb44" [ 1974.732817] env[68617]: _type = "Task" [ 1974.732817] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1974.742112] env[68617]: DEBUG oslo_vmware.api [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52e561a6-da7c-7d28-96ff-af9896c1cb44, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1974.797627] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1974.797867] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1974.798023] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Deleting the datastore file [datastore2] 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4 {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1974.798350] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c1bb092d-ff26-46e7-9018-6c519f3ca825 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1974.804805] env[68617]: DEBUG oslo_vmware.api [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Waiting for the task: (returnval){ [ 1974.804805] env[68617]: value = "task-3470885" [ 1974.804805] env[68617]: _type = "Task" [ 1974.804805] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1974.813886] env[68617]: DEBUG oslo_vmware.api [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Task: {'id': task-3470885, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1975.243340] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1975.243700] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Creating directory with path [datastore2] vmware_temp/9eaac455-b622-4d28-9e3a-13d79c482e6c/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1975.243829] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4f3ad1eb-b0f5-4aca-9a50-b20686f24071 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1975.254679] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Created directory with path [datastore2] vmware_temp/9eaac455-b622-4d28-9e3a-13d79c482e6c/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1975.254920] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Fetch image to [datastore2] vmware_temp/9eaac455-b622-4d28-9e3a-13d79c482e6c/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1975.255022] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/9eaac455-b622-4d28-9e3a-13d79c482e6c/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1975.255737] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f48bb7c-7066-4608-9ef7-d93f93a3663e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1975.262385] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ac79a97-fb06-4228-8d08-138d69a03501 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1975.271120] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e71f105-0219-4426-a21c-e61aa160b5dd {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1975.301054] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bfc901a-4e83-4c25-9989-f6d462c1220d 
{{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1975.308753] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fada7893-cc47-4d51-b41c-457cd5c5c567 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1975.314771] env[68617]: DEBUG oslo_vmware.api [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Task: {'id': task-3470885, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078985} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1975.314996] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1975.315192] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1975.315359] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1975.315530] env[68617]: INFO nova.compute.manager [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Took 0.60 seconds to destroy the instance on the hypervisor. 
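The DeleteDatastoreFile_Task records above trace oslo.vmware's poll-until-done pattern: wait_for_task logs the task reference it is waiting on, _poll_task reports progress (0% here), and the completion record carries duration_secs. The sketch below is a minimal, simplified version of that loop, not oslo.vmware's actual code: fetch_task_info and the TaskInfo shape are hypothetical stand-ins for the vSphere task API, and the real implementation polls from a green-thread looping call and translates named faults such as 'InvalidArgument' into VimFaultException.

import time
from dataclasses import dataclass

@dataclass
class TaskInfo:
    state: str               # 'running' | 'success' | 'error'
    progress: int = 0
    error: str | None = None

def wait_for_task(fetch_task_info, task_ref, interval=0.5):
    # Simplified poll loop; fetch_task_info is a hypothetical stand-in for
    # the PropertyCollector round-trip that reads the task's info property.
    start = time.monotonic()
    while True:
        info = fetch_task_info(task_ref)
        if info.state == 'success':
            return time.monotonic() - start   # reported as duration_secs in the log
        if info.state == 'error':
            # oslo.vmware maps the fault name (e.g. 'InvalidArgument') to an
            # exception class here; RuntimeError keeps this sketch stdlib-only.
            raise RuntimeError(info.error)
        print(f"Task {task_ref} progress is {info.progress}%")
        time.sleep(interval)

# Toy run: one 'running' poll, then success, like task-3470885 above.
states = iter([TaskInfo('running'), TaskInfo('success', 100)])
print(f"completed in {wait_for_task(lambda ref: next(states), 'task-3470885', 0.01):.3f}s")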
[ 1975.317532] env[68617]: DEBUG nova.compute.claims [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1975.317707] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1975.317921] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1975.335018] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1975.387401] env[68617]: DEBUG oslo_vmware.rw_handles [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9eaac455-b622-4d28-9e3a-13d79c482e6c/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1975.450406] env[68617]: DEBUG oslo_vmware.rw_handles [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1975.450406] env[68617]: DEBUG oslo_vmware.rw_handles [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9eaac455-b622-4d28-9e3a-13d79c482e6c/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1975.579862] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3793f110-2839-4895-9826-5f550c8176dc {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1975.587721] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca87cf92-b8ac-4162-b947-4b34f532b958 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1975.617693] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d609790-c4ee-430e-9f9d-0a50e872aafd {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1975.624633] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a8e5120-85e5-4ae4-b388-7be346f8f8ac {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1975.637260] env[68617]: DEBUG nova.compute.provider_tree [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1975.645552] env[68617]: DEBUG nova.scheduler.client.report [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1975.659877] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.342s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1975.660410] env[68617]: ERROR nova.compute.manager [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1975.660410] env[68617]: Faults: ['InvalidArgument'] [ 1975.660410] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Traceback (most recent call last): [ 1975.660410] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1975.660410] env[68617]: 
ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] self.driver.spawn(context, instance, image_meta, [ 1975.660410] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1975.660410] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1975.660410] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1975.660410] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] self._fetch_image_if_missing(context, vi) [ 1975.660410] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1975.660410] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] image_cache(vi, tmp_image_ds_loc) [ 1975.660410] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1975.660830] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] vm_util.copy_virtual_disk( [ 1975.660830] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1975.660830] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] session._wait_for_task(vmdk_copy_task) [ 1975.660830] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1975.660830] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] return self.wait_for_task(task_ref) [ 1975.660830] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1975.660830] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] return evt.wait() [ 1975.660830] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1975.660830] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] result = hub.switch() [ 1975.660830] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1975.660830] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] return self.greenlet.switch() [ 1975.660830] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1975.660830] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] self.f(*self.args, **self.kw) [ 1975.661270] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1975.661270] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] raise exceptions.translate_fault(task_info.error) [ 1975.661270] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1975.661270] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Faults: ['InvalidArgument'] [ 1975.661270] env[68617]: ERROR nova.compute.manager [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] [ 1975.661270] env[68617]: DEBUG nova.compute.utils [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1975.662534] env[68617]: DEBUG nova.compute.manager [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Build of instance 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4 was re-scheduled: A specified parameter was not correct: fileType [ 1975.662534] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1975.662900] env[68617]: DEBUG nova.compute.manager [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1975.663085] env[68617]: DEBUG nova.compute.manager [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1975.663259] env[68617]: DEBUG nova.compute.manager [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1975.663427] env[68617]: DEBUG nova.network.neutron [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1975.698203] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1975.698444] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1975.698527] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1975.717068] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.717487] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.717487] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.717487] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.717668] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.717703] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Skipping network cache update for instance because it is Building. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.717809] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.717921] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.718052] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.718360] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1975.960122] env[68617]: DEBUG nova.network.neutron [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1975.974040] env[68617]: INFO nova.compute.manager [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Took 0.31 seconds to deallocate network for instance. 
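The _heal_instance_info_cache records above show the periodic task rebuilding its candidate list and skipping every instance that is still Building before concluding there is nothing to refresh. A condensed sketch of that selection logic follows; Instance and the refresh_cache callback are toy stand-ins for Nova's objects, under the assumption that only the build-state check matters here.

from dataclasses import dataclass

@dataclass
class Instance:
    uuid: str
    vm_state: str  # e.g. 'building', 'active'

def heal_instance_info_cache(instances, refresh_cache):
    # Pick the next instance eligible for a network info cache refresh
    # (simplified; the real periodic task also rotates through the list).
    for inst in instances:
        if inst.vm_state == 'building':
            # Matches the log: "Skipping network cache update for instance
            # because it is Building."
            print(f"[instance: {inst.uuid}] skipped (Building)")
            continue
        refresh_cache(inst)
        return inst
    print("Didn't find any instances for network info cache update.")
    return None

# Toy usage mirroring the run above: every instance is still building.
building = [Instance('f54002b0-d60e-44ff-82a5-ef2f5193c48c', 'building')]
heal_instance_info_cache(building, lambda inst: None)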
[ 1976.069547] env[68617]: INFO nova.scheduler.client.report [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Deleted allocations for instance 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4 [ 1976.097899] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ff063994-7857-4cd3-a007-5295e8524c8c tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Lock "6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 634.833s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1976.099215] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d2d19b3a-6461-4472-b37a-867a0762c7fc tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Lock "6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 438.438s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1976.099453] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d2d19b3a-6461-4472-b37a-867a0762c7fc tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Acquiring lock "6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1976.099661] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d2d19b3a-6461-4472-b37a-867a0762c7fc tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Lock "6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1976.099826] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d2d19b3a-6461-4472-b37a-867a0762c7fc tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Lock "6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1976.101780] env[68617]: INFO nova.compute.manager [None req-d2d19b3a-6461-4472-b37a-867a0762c7fc tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Terminating instance [ 1976.103474] env[68617]: DEBUG nova.compute.manager [None req-d2d19b3a-6461-4472-b37a-867a0762c7fc tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Start destroying the instance on the hypervisor. 
{{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1976.103669] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d2d19b3a-6461-4472-b37a-867a0762c7fc tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1976.104168] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f9b0b508-1c42-4ae7-882a-8efb5dd74e87 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1976.113782] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c00866b-2774-4661-9331-2e8e10af17a2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1976.124030] env[68617]: DEBUG nova.compute.manager [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1976.145433] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-d2d19b3a-6461-4472-b37a-867a0762c7fc tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4 could not be found. [ 1976.145679] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d2d19b3a-6461-4472-b37a-867a0762c7fc tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1976.145861] env[68617]: INFO nova.compute.manager [None req-d2d19b3a-6461-4472-b37a-867a0762c7fc tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1976.146131] env[68617]: DEBUG oslo.service.loopingcall [None req-d2d19b3a-6461-4472-b37a-867a0762c7fc tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1976.146374] env[68617]: DEBUG nova.compute.manager [-] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1976.146479] env[68617]: DEBUG nova.network.neutron [-] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1976.170352] env[68617]: DEBUG nova.network.neutron [-] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1976.171924] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1976.172165] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1976.173562] env[68617]: INFO nova.compute.claims [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1976.177867] env[68617]: INFO nova.compute.manager [-] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] Took 0.03 seconds to deallocate network for instance. [ 1976.270829] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d2d19b3a-6461-4472-b37a-867a0762c7fc tempest-MultipleCreateTestJSON-1212752417 tempest-MultipleCreateTestJSON-1212752417-project-member] Lock "6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.172s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1976.271661] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 72.622s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1976.271846] env[68617]: INFO nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4] During sync_power_state the instance has a pending task (deleting). Skip. 
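The lockutils records around this point all follow the same acquire/release template, including the waited and held timings (72.622s waited on the instance lock above, 0.268s held on compute_resources below). Below is a minimal sketch of how a named lock can be timed and logged this way; the lock registry and message format only imitate oslo_concurrency.lockutils, they are not its implementation.

import threading
import time
from contextlib import contextmanager

_locks: dict[str, threading.Lock] = {}

@contextmanager
def timed_lock(name, caller):
    # Imitation of lockutils' acquire/release log lines with waited/held timings.
    lock = _locks.setdefault(name, threading.Lock())
    t0 = time.monotonic()
    lock.acquire()
    waited = time.monotonic() - t0
    print(f'Lock "{name}" acquired by "{caller}" :: waited {waited:.3f}s')
    t1 = time.monotonic()
    try:
        yield
    finally:
        held = time.monotonic() - t1
        lock.release()
        print(f'Lock "{name}" "released" by "{caller}" :: held {held:.3f}s')

# Toy usage:
with timed_lock("compute_resources", "ResourceTracker.instance_claim"):
    time.sleep(0.01)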
[ 1976.272027] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "6a1e1d9d-bae1-439c-9a6d-46dd3bf26ee4" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1976.355761] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f008ea0c-3266-4b57-af51-1caa60239cf7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1976.363948] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51ace49f-d534-4139-b57c-1de8fb0ef57b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1976.392462] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e4b9847-139f-41f7-914c-ca0bde838918 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1976.399145] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8593737f-4140-47e0-8917-090904b130d7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1976.412624] env[68617]: DEBUG nova.compute.provider_tree [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1976.422847] env[68617]: DEBUG nova.scheduler.client.report [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1976.440337] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.268s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1976.440960] env[68617]: DEBUG nova.compute.manager [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Start building networks asynchronously for instance. 
{{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1976.482877] env[68617]: DEBUG nova.compute.utils [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1976.484166] env[68617]: DEBUG nova.compute.manager [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1976.484369] env[68617]: DEBUG nova.network.neutron [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1976.493052] env[68617]: DEBUG nova.compute.manager [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1976.555885] env[68617]: DEBUG nova.policy [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '11eecc8f059e410cb97bafaadc378f89', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4de7b27e9cf04c16b8dee80e756404fd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 1976.562145] env[68617]: DEBUG nova.compute.manager [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1976.592278] env[68617]: DEBUG nova.virt.hardware [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1976.592627] env[68617]: DEBUG nova.virt.hardware [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1976.592853] env[68617]: DEBUG nova.virt.hardware [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1976.593138] env[68617]: DEBUG nova.virt.hardware [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1976.593355] env[68617]: DEBUG nova.virt.hardware [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1976.593569] env[68617]: DEBUG nova.virt.hardware [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1976.593867] env[68617]: DEBUG nova.virt.hardware [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1976.594109] env[68617]: DEBUG nova.virt.hardware [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1976.594359] env[68617]: DEBUG nova.virt.hardware [None 
req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1976.594609] env[68617]: DEBUG nova.virt.hardware [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1976.594863] env[68617]: DEBUG nova.virt.hardware [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1976.596100] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d2d1fa2-d99c-4934-8516-42a39da93267 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1976.607782] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbb4ef91-7ca2-4728-b6a9-b1fc63432ca7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1977.025297] env[68617]: DEBUG nova.network.neutron [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Successfully created port: aa1e3a68-8079-4d79-b672-f1985c9cdf0c {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1977.587633] env[68617]: DEBUG nova.network.neutron [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Successfully updated port: aa1e3a68-8079-4d79-b672-f1985c9cdf0c {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1977.602566] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "refresh_cache-82f72313-f493-4acd-a95e-765feb74a358" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1977.602786] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquired lock "refresh_cache-82f72313-f493-4acd-a95e-765feb74a358" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1977.602952] env[68617]: DEBUG nova.network.neutron [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1977.645421] env[68617]: DEBUG nova.network.neutron [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 
tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Instance cache missing network info. {{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1977.716592] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1977.813050] env[68617]: DEBUG nova.network.neutron [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Updating instance_info_cache with network_info: [{"id": "aa1e3a68-8079-4d79-b672-f1985c9cdf0c", "address": "fa:16:3e:29:88:d4", "network": {"id": "ad29e76d-388b-42ca-9526-7b6c236321e3", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1855301645-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4de7b27e9cf04c16b8dee80e756404fd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8e272539-d425-489f-9a63-aba692e88933", "external-id": "nsx-vlan-transportzone-869", "segmentation_id": 869, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapaa1e3a68-80", "ovs_interfaceid": "aa1e3a68-8079-4d79-b672-f1985c9cdf0c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1977.826507] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Releasing lock "refresh_cache-82f72313-f493-4acd-a95e-765feb74a358" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1977.826829] env[68617]: DEBUG nova.compute.manager [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Instance network_info: |[{"id": "aa1e3a68-8079-4d79-b672-f1985c9cdf0c", "address": "fa:16:3e:29:88:d4", "network": {"id": "ad29e76d-388b-42ca-9526-7b6c236321e3", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1855301645-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4de7b27e9cf04c16b8dee80e756404fd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": 
"8e272539-d425-489f-9a63-aba692e88933", "external-id": "nsx-vlan-transportzone-869", "segmentation_id": 869, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapaa1e3a68-80", "ovs_interfaceid": "aa1e3a68-8079-4d79-b672-f1985c9cdf0c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1977.827234] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:29:88:d4', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8e272539-d425-489f-9a63-aba692e88933', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'aa1e3a68-8079-4d79-b672-f1985c9cdf0c', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1977.834886] env[68617]: DEBUG oslo.service.loopingcall [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1977.835354] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1977.835579] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-00eec2eb-1e5c-472a-9259-3021c90cae1d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1977.856241] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1977.856241] env[68617]: value = "task-3470886" [ 1977.856241] env[68617]: _type = "Task" [ 1977.856241] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1977.864193] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470886, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1977.995880] env[68617]: DEBUG nova.compute.manager [req-d1d81d52-a57f-4003-8148-9933cf000bdb req-9caa3e82-141a-4a77-b2fd-e9b5e1f1ee38 service nova] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Received event network-vif-plugged-aa1e3a68-8079-4d79-b672-f1985c9cdf0c {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1977.996125] env[68617]: DEBUG oslo_concurrency.lockutils [req-d1d81d52-a57f-4003-8148-9933cf000bdb req-9caa3e82-141a-4a77-b2fd-e9b5e1f1ee38 service nova] Acquiring lock "82f72313-f493-4acd-a95e-765feb74a358-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1977.996410] env[68617]: DEBUG oslo_concurrency.lockutils [req-d1d81d52-a57f-4003-8148-9933cf000bdb req-9caa3e82-141a-4a77-b2fd-e9b5e1f1ee38 service nova] Lock "82f72313-f493-4acd-a95e-765feb74a358-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1977.996570] env[68617]: DEBUG oslo_concurrency.lockutils [req-d1d81d52-a57f-4003-8148-9933cf000bdb req-9caa3e82-141a-4a77-b2fd-e9b5e1f1ee38 service nova] Lock "82f72313-f493-4acd-a95e-765feb74a358-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1977.996735] env[68617]: DEBUG nova.compute.manager [req-d1d81d52-a57f-4003-8148-9933cf000bdb req-9caa3e82-141a-4a77-b2fd-e9b5e1f1ee38 service nova] [instance: 82f72313-f493-4acd-a95e-765feb74a358] No waiting events found dispatching network-vif-plugged-aa1e3a68-8079-4d79-b672-f1985c9cdf0c {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1977.996900] env[68617]: WARNING nova.compute.manager [req-d1d81d52-a57f-4003-8148-9933cf000bdb req-9caa3e82-141a-4a77-b2fd-e9b5e1f1ee38 service nova] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Received unexpected event network-vif-plugged-aa1e3a68-8079-4d79-b672-f1985c9cdf0c for instance with vm_state building and task_state spawning. [ 1977.997070] env[68617]: DEBUG nova.compute.manager [req-d1d81d52-a57f-4003-8148-9933cf000bdb req-9caa3e82-141a-4a77-b2fd-e9b5e1f1ee38 service nova] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Received event network-changed-aa1e3a68-8079-4d79-b672-f1985c9cdf0c {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1977.997230] env[68617]: DEBUG nova.compute.manager [req-d1d81d52-a57f-4003-8148-9933cf000bdb req-9caa3e82-141a-4a77-b2fd-e9b5e1f1ee38 service nova] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Refreshing instance network info cache due to event network-changed-aa1e3a68-8079-4d79-b672-f1985c9cdf0c. 
{{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1977.997416] env[68617]: DEBUG oslo_concurrency.lockutils [req-d1d81d52-a57f-4003-8148-9933cf000bdb req-9caa3e82-141a-4a77-b2fd-e9b5e1f1ee38 service nova] Acquiring lock "refresh_cache-82f72313-f493-4acd-a95e-765feb74a358" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1977.997550] env[68617]: DEBUG oslo_concurrency.lockutils [req-d1d81d52-a57f-4003-8148-9933cf000bdb req-9caa3e82-141a-4a77-b2fd-e9b5e1f1ee38 service nova] Acquired lock "refresh_cache-82f72313-f493-4acd-a95e-765feb74a358" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1977.997702] env[68617]: DEBUG nova.network.neutron [req-d1d81d52-a57f-4003-8148-9933cf000bdb req-9caa3e82-141a-4a77-b2fd-e9b5e1f1ee38 service nova] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Refreshing network info cache for port aa1e3a68-8079-4d79-b672-f1985c9cdf0c {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1978.238411] env[68617]: DEBUG nova.network.neutron [req-d1d81d52-a57f-4003-8148-9933cf000bdb req-9caa3e82-141a-4a77-b2fd-e9b5e1f1ee38 service nova] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Updated VIF entry in instance network info cache for port aa1e3a68-8079-4d79-b672-f1985c9cdf0c. {{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1978.238796] env[68617]: DEBUG nova.network.neutron [req-d1d81d52-a57f-4003-8148-9933cf000bdb req-9caa3e82-141a-4a77-b2fd-e9b5e1f1ee38 service nova] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Updating instance_info_cache with network_info: [{"id": "aa1e3a68-8079-4d79-b672-f1985c9cdf0c", "address": "fa:16:3e:29:88:d4", "network": {"id": "ad29e76d-388b-42ca-9526-7b6c236321e3", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1855301645-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4de7b27e9cf04c16b8dee80e756404fd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8e272539-d425-489f-9a63-aba692e88933", "external-id": "nsx-vlan-transportzone-869", "segmentation_id": 869, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapaa1e3a68-80", "ovs_interfaceid": "aa1e3a68-8079-4d79-b672-f1985c9cdf0c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1978.247868] env[68617]: DEBUG oslo_concurrency.lockutils [req-d1d81d52-a57f-4003-8148-9933cf000bdb req-9caa3e82-141a-4a77-b2fd-e9b5e1f1ee38 service nova] Releasing lock "refresh_cache-82f72313-f493-4acd-a95e-765feb74a358" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1978.367851] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470886, 'name': CreateVM_Task, 'duration_secs': 0.290623} completed successfully. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1978.368034] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1978.374797] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1978.374987] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1978.375287] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1978.375522] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e42d788b-492d-4e94-95cb-70cc5aff1bf4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1978.379765] env[68617]: DEBUG oslo_vmware.api [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Waiting for the task: (returnval){ [ 1978.379765] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52e9e388-31d8-62ef-41c9-f7fd1b7e3543" [ 1978.379765] env[68617]: _type = "Task" [ 1978.379765] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1978.386866] env[68617]: DEBUG oslo_vmware.api [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52e9e388-31d8-62ef-41c9-f7fd1b7e3543, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1978.891050] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1978.891373] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1978.891536] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1979.698709] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1979.698899] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1979.699056] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1979.699208] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1980.700449] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1982.699158] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1983.699559] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1983.713706] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1983.713706] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1983.713706] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1983.713706] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1983.714840] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81f33b7f-ef30-4f5d-bfe2-82c0e71c7d00 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1983.724431] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-016809e5-f244-4d6c-9472-42f95cb2d351 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1983.738916] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-513ce5d8-714d-4026-ac97-7759ae975b55 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1983.745728] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6db0801-7f28-4cda-9434-74ead63a5ca8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1983.775879] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 
None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180933MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1983.776081] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1983.776231] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1983.853674] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance f54002b0-d60e-44ff-82a5-ef2f5193c48c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1983.853859] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 2c950cba-7698-48e0-8852-bf569f58f967 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1983.853986] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 12ed2a40-3d74-49a2-95b4-ccaaf58c8060 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1983.854166] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 21d0560a-fde3-4c16-b2fc-06d6f8668a7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1983.854243] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 902b5ab9-23b8-450f-853a-b2da889c3afd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1983.854356] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 922c8926-c636-4463-85d6-4f2a6325b85a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1983.854476] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b1a8dc60-af98-4f80-96cf-b2550ea8c13a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1983.854592] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a4ab788d-327a-47cc-8ae7-e1b9be889759 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1983.854704] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 17bb8415-dafd-47ed-9a14-52163ba5e7db actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1983.854815] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 82f72313-f493-4acd-a95e-765feb74a358 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1983.866923] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 797b434e-a913-43dc-a1df-39fe82da1221 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1983.867140] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1983.867276] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1984.032402] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-599b9246-0f5c-42dc-a9d6-52d959b01956 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1984.040113] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9e19dde-0816-432b-ab8a-8d78e855a458 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1984.069565] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0266cd74-110f-448b-886c-7b61f1f84085 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1984.076883] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4edcf0b-a9e7-4689-8e59-0b0cc2a83d5a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1984.089681] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1984.097862] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1984.113038] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1984.113225] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.337s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1986.107538] env[68617]: DEBUG oslo_service.periodic_task [None 
req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1986.107866] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2022.584963] env[68617]: WARNING oslo_vmware.rw_handles [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2022.584963] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2022.584963] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2022.584963] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2022.584963] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2022.584963] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 2022.584963] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2022.584963] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2022.584963] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2022.584963] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2022.584963] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2022.584963] env[68617]: ERROR oslo_vmware.rw_handles [ 2022.585786] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/9eaac455-b622-4d28-9e3a-13d79c482e6c/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2022.587375] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2022.587659] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Copying Virtual Disk [datastore2] vmware_temp/9eaac455-b622-4d28-9e3a-13d79c482e6c/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/9eaac455-b622-4d28-9e3a-13d79c482e6c/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2022.587983] env[68617]: DEBUG 
oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a6845119-f549-469d-9aa6-b57d2683b30e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2022.596930] env[68617]: DEBUG oslo_vmware.api [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Waiting for the task: (returnval){ [ 2022.596930] env[68617]: value = "task-3470887" [ 2022.596930] env[68617]: _type = "Task" [ 2022.596930] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2022.606128] env[68617]: DEBUG oslo_vmware.api [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Task: {'id': task-3470887, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2023.107046] env[68617]: DEBUG oslo_vmware.exceptions [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Fault InvalidArgument not matched. {{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2023.107294] env[68617]: DEBUG oslo_concurrency.lockutils [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2023.107886] env[68617]: ERROR nova.compute.manager [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2023.107886] env[68617]: Faults: ['InvalidArgument'] [ 2023.107886] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Traceback (most recent call last): [ 2023.107886] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2023.107886] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] yield resources [ 2023.107886] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2023.107886] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] self.driver.spawn(context, instance, image_meta, [ 2023.107886] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2023.107886] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2023.107886] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2023.107886] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] self._fetch_image_if_missing(context, vi) [ 2023.107886] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2023.108356] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] image_cache(vi, tmp_image_ds_loc) [ 2023.108356] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2023.108356] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] vm_util.copy_virtual_disk( [ 2023.108356] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2023.108356] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] session._wait_for_task(vmdk_copy_task) [ 2023.108356] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2023.108356] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] return self.wait_for_task(task_ref) [ 2023.108356] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2023.108356] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] return evt.wait() [ 2023.108356] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2023.108356] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] result = hub.switch() [ 2023.108356] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2023.108356] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] return self.greenlet.switch() [ 2023.108825] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2023.108825] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] self.f(*self.args, **self.kw) [ 2023.108825] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2023.108825] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] raise exceptions.translate_fault(task_info.error) [ 2023.108825] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2023.108825] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Faults: ['InvalidArgument'] [ 2023.108825] 
env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] [ 2023.108825] env[68617]: INFO nova.compute.manager [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Terminating instance [ 2023.109814] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2023.110033] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2023.110273] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8afe6211-5cf5-46e7-afaf-61c8d61c0d25 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2023.112585] env[68617]: DEBUG nova.compute.manager [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2023.112785] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2023.113502] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e17f6e1e-9145-4920-9073-3c1de9981c9d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2023.120124] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2023.120331] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-210e80cb-2c6c-4907-a614-e3431cb09476 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2023.122448] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2023.122615] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] 
Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2023.123551] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9657203f-92a5-4ca3-aecb-b35a9b65139f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2023.128165] env[68617]: DEBUG oslo_vmware.api [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Waiting for the task: (returnval){ [ 2023.128165] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52d914e6-27f8-73d8-b7c1-1daaffe8c717" [ 2023.128165] env[68617]: _type = "Task" [ 2023.128165] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2023.135164] env[68617]: DEBUG oslo_vmware.api [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52d914e6-27f8-73d8-b7c1-1daaffe8c717, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2023.187317] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2023.187510] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2023.187687] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Deleting the datastore file [datastore2] f54002b0-d60e-44ff-82a5-ef2f5193c48c {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2023.187948] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-51934212-41a1-408c-9f92-4a689945244d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2023.194249] env[68617]: DEBUG oslo_vmware.api [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Waiting for the task: (returnval){ [ 2023.194249] env[68617]: value = "task-3470889" [ 2023.194249] env[68617]: _type = "Task" [ 2023.194249] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2023.201909] env[68617]: DEBUG oslo_vmware.api [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Task: {'id': task-3470889, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2023.638970] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2023.639323] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Creating directory with path [datastore2] vmware_temp/0e656d53-7992-4d58-b869-6df0776ae482/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2023.639439] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5955e622-8582-4888-9be6-f2b9f87d8ae0 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2023.650866] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Created directory with path [datastore2] vmware_temp/0e656d53-7992-4d58-b869-6df0776ae482/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2023.651114] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Fetch image to [datastore2] vmware_temp/0e656d53-7992-4d58-b869-6df0776ae482/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2023.651310] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/0e656d53-7992-4d58-b869-6df0776ae482/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2023.652068] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7c8718d-d64a-408e-aae4-1dea589526de {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2023.659282] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f81f166c-8033-466b-b706-b72eee575f20 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2023.668666] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8ae4fdc-fd0b-4116-93e0-9e15e098e674 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2023.702857] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-231a7762-b082-4bbd-b851-6ba9d345db7c {{(pid=68617) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2023.710831] env[68617]: DEBUG oslo_vmware.api [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Task: {'id': task-3470889, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.06943} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2023.712275] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2023.712468] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2023.712635] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2023.712810] env[68617]: INFO nova.compute.manager [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 2023.714571] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6cb58c87-7e9c-4cca-a4e8-f85d0b20b1ab {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2023.716441] env[68617]: DEBUG nova.compute.claims [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2023.716626] env[68617]: DEBUG oslo_concurrency.lockutils [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2023.716884] env[68617]: DEBUG oslo_concurrency.lockutils [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2023.738897] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2023.791371] env[68617]: DEBUG oslo_vmware.rw_handles [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0e656d53-7992-4d58-b869-6df0776ae482/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2023.850485] env[68617]: DEBUG oslo_vmware.rw_handles [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2023.850603] env[68617]: DEBUG oslo_vmware.rw_handles [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0e656d53-7992-4d58-b869-6df0776ae482/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2023.953674] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e9e124b-7092-4b77-85c0-e9bff8583c70 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2023.961140] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-735d677e-44f1-4de2-8fbc-88b2f996ea21 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2023.990833] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebff566a-9c0f-4147-a1ec-391589e4874f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2023.998086] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-917ef770-808e-4df6-bc47-4c4b17d7c7fb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2024.011010] env[68617]: DEBUG nova.compute.provider_tree [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2024.019021] env[68617]: DEBUG nova.scheduler.client.report [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2024.032921] env[68617]: DEBUG oslo_concurrency.lockutils [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.316s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2024.033491] env[68617]: ERROR nova.compute.manager [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2024.033491] env[68617]: Faults: ['InvalidArgument'] [ 2024.033491] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Traceback (most recent call last): [ 2024.033491] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2024.033491] env[68617]: ERROR 
nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] self.driver.spawn(context, instance, image_meta, [ 2024.033491] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2024.033491] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2024.033491] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2024.033491] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] self._fetch_image_if_missing(context, vi) [ 2024.033491] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2024.033491] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] image_cache(vi, tmp_image_ds_loc) [ 2024.033491] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2024.034206] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] vm_util.copy_virtual_disk( [ 2024.034206] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2024.034206] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] session._wait_for_task(vmdk_copy_task) [ 2024.034206] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2024.034206] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] return self.wait_for_task(task_ref) [ 2024.034206] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2024.034206] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] return evt.wait() [ 2024.034206] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2024.034206] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] result = hub.switch() [ 2024.034206] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2024.034206] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] return self.greenlet.switch() [ 2024.034206] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2024.034206] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] self.f(*self.args, **self.kw) [ 2024.034746] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2024.034746] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] raise exceptions.translate_fault(task_info.error) [ 2024.034746] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2024.034746] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Faults: ['InvalidArgument'] [ 2024.034746] env[68617]: ERROR nova.compute.manager [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] [ 2024.034746] env[68617]: DEBUG nova.compute.utils [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2024.036009] env[68617]: DEBUG nova.compute.manager [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Build of instance f54002b0-d60e-44ff-82a5-ef2f5193c48c was re-scheduled: A specified parameter was not correct: fileType [ 2024.036009] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2024.036397] env[68617]: DEBUG nova.compute.manager [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2024.036578] env[68617]: DEBUG nova.compute.manager [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2024.036780] env[68617]: DEBUG nova.compute.manager [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2024.036944] env[68617]: DEBUG nova.network.neutron [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2024.340060] env[68617]: DEBUG nova.network.neutron [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2024.350598] env[68617]: INFO nova.compute.manager [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Took 0.31 seconds to deallocate network for instance. [ 2024.441863] env[68617]: INFO nova.scheduler.client.report [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Deleted allocations for instance f54002b0-d60e-44ff-82a5-ef2f5193c48c [ 2024.464829] env[68617]: DEBUG oslo_concurrency.lockutils [None req-82de6415-44b9-4c52-b659-9bcd742923f2 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "f54002b0-d60e-44ff-82a5-ef2f5193c48c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 485.145s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2024.465996] env[68617]: DEBUG oslo_concurrency.lockutils [None req-55c4f4bc-0d05-4fce-9b83-cc0321f57d66 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "f54002b0-d60e-44ff-82a5-ef2f5193c48c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 289.079s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2024.466235] env[68617]: DEBUG oslo_concurrency.lockutils [None req-55c4f4bc-0d05-4fce-9b83-cc0321f57d66 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "f54002b0-d60e-44ff-82a5-ef2f5193c48c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2024.466446] env[68617]: DEBUG oslo_concurrency.lockutils [None req-55c4f4bc-0d05-4fce-9b83-cc0321f57d66 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "f54002b0-d60e-44ff-82a5-ef2f5193c48c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s 
{{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2024.466633] env[68617]: DEBUG oslo_concurrency.lockutils [None req-55c4f4bc-0d05-4fce-9b83-cc0321f57d66 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "f54002b0-d60e-44ff-82a5-ef2f5193c48c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2024.468603] env[68617]: INFO nova.compute.manager [None req-55c4f4bc-0d05-4fce-9b83-cc0321f57d66 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Terminating instance [ 2024.470425] env[68617]: DEBUG nova.compute.manager [None req-55c4f4bc-0d05-4fce-9b83-cc0321f57d66 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2024.470646] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-55c4f4bc-0d05-4fce-9b83-cc0321f57d66 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2024.471149] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-92b765ef-1c2c-40cf-895c-37bbece782d0 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2024.481287] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cba349fc-fce6-4392-af66-fae134baa03e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2024.491895] env[68617]: DEBUG nova.compute.manager [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2024.515020] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-55c4f4bc-0d05-4fce-9b83-cc0321f57d66 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f54002b0-d60e-44ff-82a5-ef2f5193c48c could not be found. [ 2024.515020] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-55c4f4bc-0d05-4fce-9b83-cc0321f57d66 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2024.515020] env[68617]: INFO nova.compute.manager [None req-55c4f4bc-0d05-4fce-9b83-cc0321f57d66 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Took 0.04 seconds to destroy the instance on the hypervisor. 
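The records above show the vmwareapi destroy path tolerating a VM that never materialized on the backend: SearchIndex.FindAllByUuid finds nothing, vmops logs the InstanceNotFound as a WARNING, marks the instance destroyed anyway, and termination proceeds to network and allocation cleanup. A minimal Python sketch of that tolerance pattern follows; InstanceNotFound, find_vm_by_uuid() and destroy_vm() are illustrative stand-ins, not the actual nova.virt.vmwareapi API.

import logging

LOG = logging.getLogger(__name__)

class InstanceNotFound(Exception):
    """Illustrative stand-in for nova.exception.InstanceNotFound."""

def find_vm_by_uuid(session, uuid):
    # Stand-in for the SearchIndex.FindAllByUuid lookup seen above;
    # here it always misses, like instance f54002b0-... in the log.
    raise InstanceNotFound("Instance %s could not be found." % uuid)

def destroy_vm(session, vm_ref):
    # Stand-in for unregistering and deleting the VM on the backend.
    pass

def destroy_instance(session, uuid):
    # Treat "already gone" as success: swallow InstanceNotFound with a
    # WARNING so the caller can still deallocate networks and release
    # placement allocations for an instance that never reached vCenter.
    try:
        destroy_vm(session, find_vm_by_uuid(session, uuid))
    except InstanceNotFound:
        LOG.warning("Instance does not exist on backend: %s", uuid)
    LOG.debug("Instance destroyed")

destroy_instance(None, "f54002b0-d60e-44ff-82a5-ef2f5193c48c")

Failing the terminate here would strand the instance in a deleting task state; swallowing the miss keeps delete idempotent, which is why the log can report a 0.04-second destroy for a VM that never existed.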
[ 2024.515020] env[68617]: DEBUG oslo.service.loopingcall [None req-55c4f4bc-0d05-4fce-9b83-cc0321f57d66 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2024.515020] env[68617]: DEBUG nova.compute.manager [-] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2024.515316] env[68617]: DEBUG nova.network.neutron [-] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2024.541995] env[68617]: DEBUG nova.network.neutron [-] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2024.544693] env[68617]: DEBUG oslo_concurrency.lockutils [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2024.544952] env[68617]: DEBUG oslo_concurrency.lockutils [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2024.546610] env[68617]: INFO nova.compute.claims [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2024.550192] env[68617]: INFO nova.compute.manager [-] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] Took 0.04 seconds to deallocate network for instance. [ 2024.656832] env[68617]: DEBUG oslo_concurrency.lockutils [None req-55c4f4bc-0d05-4fce-9b83-cc0321f57d66 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "f54002b0-d60e-44ff-82a5-ef2f5193c48c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.191s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2024.658571] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "f54002b0-d60e-44ff-82a5-ef2f5193c48c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 121.009s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2024.658571] env[68617]: INFO nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: f54002b0-d60e-44ff-82a5-ef2f5193c48c] During sync_power_state the instance has a pending task (deleting). Skip. 
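Right above, _sync_power_states waited 121 seconds for the per-instance lock and then skipped the instance because a delete was already in flight. A sketch of that guard, using a plain threading.Lock in place of the named lockutils lock; the Instance class and its attributes are assumptions for illustration, not Nova's actual objects.

import logging
import threading

LOG = logging.getLogger(__name__)

class Instance:
    def __init__(self, uuid, task_state=None):
        self.uuid = uuid
        self.task_state = task_state   # e.g. 'deleting' while a delete runs
        self.lock = threading.Lock()   # stands in for the per-UUID lockutils lock

def query_driver_power_state_and_sync(instance, driver_power_state):
    # Serialize against lifecycle operations on the same instance, then
    # bail out if another task owns it: syncing power state mid-delete
    # would race the operation already holding the instance.
    with instance.lock:
        if instance.task_state is not None:
            LOG.info("During sync_power_state the instance has a pending "
                     "task (%s). Skip.", instance.task_state)
            return
        LOG.debug("Syncing %s to driver power state %s",
                  instance.uuid, driver_power_state)
        # ... reconcile the database record with the driver's view here.

query_driver_power_state_and_sync(
    Instance("f54002b0-d60e-44ff-82a5-ef2f5193c48c", task_state="deleting"),
    "SHUTDOWN")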
[ 2024.658571] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "f54002b0-d60e-44ff-82a5-ef2f5193c48c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2024.763513] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfa0f72c-3181-4438-9ab7-1320064e5d45 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2024.771719] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4763f1f-0d32-4a9d-93d5-53317f372842 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2024.802050] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d70cc0f-2e22-4888-a982-ff439f7ce481 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2024.808910] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db60f300-05e9-4ce5-9d90-e456b1b3418a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2024.821570] env[68617]: DEBUG nova.compute.provider_tree [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2024.830143] env[68617]: DEBUG nova.scheduler.client.report [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2024.843554] env[68617]: DEBUG oslo_concurrency.lockutils [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.299s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2024.843995] env[68617]: DEBUG nova.compute.manager [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Start building networks asynchronously for instance. 
{{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2024.876013] env[68617]: DEBUG nova.compute.utils [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2024.877192] env[68617]: DEBUG nova.compute.manager [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2024.877368] env[68617]: DEBUG nova.network.neutron [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2024.885445] env[68617]: DEBUG nova.compute.manager [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2024.934288] env[68617]: DEBUG nova.policy [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '546f17dfba284c76b4ff2dde1a09928a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '162ecdbf203345a5b63167459e388608', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 2024.946036] env[68617]: DEBUG nova.compute.manager [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2024.971902] env[68617]: DEBUG nova.virt.hardware [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2024.972147] env[68617]: DEBUG nova.virt.hardware [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2024.972305] env[68617]: DEBUG nova.virt.hardware [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2024.972485] env[68617]: DEBUG nova.virt.hardware [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2024.972630] env[68617]: DEBUG nova.virt.hardware [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2024.972821] env[68617]: DEBUG nova.virt.hardware [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2024.973126] env[68617]: DEBUG nova.virt.hardware [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2024.973351] env[68617]: DEBUG nova.virt.hardware [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2024.973569] env[68617]: DEBUG nova.virt.hardware [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 
tempest-ServersTestJSON-1350841761-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2024.973787] env[68617]: DEBUG nova.virt.hardware [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2024.974026] env[68617]: DEBUG nova.virt.hardware [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2024.975110] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4411bed-93ff-4cb1-a9b9-7e1a00e9cded {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2024.983268] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccb21bc9-b75b-456a-8e38-032b8c9b2630 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2025.223806] env[68617]: DEBUG nova.network.neutron [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Successfully created port: 4ea83c91-ca66-4c65-ac32-5a088c5cb80d {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2025.768962] env[68617]: DEBUG nova.network.neutron [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Successfully updated port: 4ea83c91-ca66-4c65-ac32-5a088c5cb80d {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2025.779841] env[68617]: DEBUG oslo_concurrency.lockutils [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquiring lock "refresh_cache-797b434e-a913-43dc-a1df-39fe82da1221" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2025.782019] env[68617]: DEBUG oslo_concurrency.lockutils [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquired lock "refresh_cache-797b434e-a913-43dc-a1df-39fe82da1221" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2025.782019] env[68617]: DEBUG nova.network.neutron [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2025.818832] env[68617]: DEBUG nova.network.neutron [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Instance cache missing network info. 
{{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2025.977451] env[68617]: DEBUG nova.network.neutron [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Updating instance_info_cache with network_info: [{"id": "4ea83c91-ca66-4c65-ac32-5a088c5cb80d", "address": "fa:16:3e:a0:a5:2f", "network": {"id": "e6650a9f-f26d-481d-8658-10ff40328891", "bridge": "br-int", "label": "tempest-ServersTestJSON-1149134727-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "162ecdbf203345a5b63167459e388608", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "aa09e855-8af1-419b-b78d-8ffcc94b1bfb", "external-id": "nsx-vlan-transportzone-901", "segmentation_id": 901, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4ea83c91-ca", "ovs_interfaceid": "4ea83c91-ca66-4c65-ac32-5a088c5cb80d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2025.990202] env[68617]: DEBUG oslo_concurrency.lockutils [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Releasing lock "refresh_cache-797b434e-a913-43dc-a1df-39fe82da1221" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2025.990505] env[68617]: DEBUG nova.compute.manager [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Instance network_info: |[{"id": "4ea83c91-ca66-4c65-ac32-5a088c5cb80d", "address": "fa:16:3e:a0:a5:2f", "network": {"id": "e6650a9f-f26d-481d-8658-10ff40328891", "bridge": "br-int", "label": "tempest-ServersTestJSON-1149134727-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "162ecdbf203345a5b63167459e388608", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "aa09e855-8af1-419b-b78d-8ffcc94b1bfb", "external-id": "nsx-vlan-transportzone-901", "segmentation_id": 901, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4ea83c91-ca", "ovs_interfaceid": "4ea83c91-ca66-4c65-ac32-5a088c5cb80d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2025.990903] 
env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a0:a5:2f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'aa09e855-8af1-419b-b78d-8ffcc94b1bfb', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4ea83c91-ca66-4c65-ac32-5a088c5cb80d', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2025.998405] env[68617]: DEBUG oslo.service.loopingcall [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2025.998893] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2025.999131] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f3066e71-13c2-4eae-aa9b-96f8af6b8ca1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2026.019478] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2026.019478] env[68617]: value = "task-3470890" [ 2026.019478] env[68617]: _type = "Task" [ 2026.019478] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2026.027185] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470890, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2026.399662] env[68617]: DEBUG nova.compute.manager [req-01135a6d-c05b-42c9-9a04-5652c760f5a8 req-827adb6a-72e5-47a5-aac3-5096a49ac98e service nova] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Received event network-vif-plugged-4ea83c91-ca66-4c65-ac32-5a088c5cb80d {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2026.399908] env[68617]: DEBUG oslo_concurrency.lockutils [req-01135a6d-c05b-42c9-9a04-5652c760f5a8 req-827adb6a-72e5-47a5-aac3-5096a49ac98e service nova] Acquiring lock "797b434e-a913-43dc-a1df-39fe82da1221-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2026.400133] env[68617]: DEBUG oslo_concurrency.lockutils [req-01135a6d-c05b-42c9-9a04-5652c760f5a8 req-827adb6a-72e5-47a5-aac3-5096a49ac98e service nova] Lock "797b434e-a913-43dc-a1df-39fe82da1221-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2026.400304] env[68617]: DEBUG oslo_concurrency.lockutils [req-01135a6d-c05b-42c9-9a04-5652c760f5a8 req-827adb6a-72e5-47a5-aac3-5096a49ac98e service nova] Lock "797b434e-a913-43dc-a1df-39fe82da1221-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2026.400473] env[68617]: DEBUG nova.compute.manager [req-01135a6d-c05b-42c9-9a04-5652c760f5a8 req-827adb6a-72e5-47a5-aac3-5096a49ac98e service nova] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] No waiting events found dispatching network-vif-plugged-4ea83c91-ca66-4c65-ac32-5a088c5cb80d {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2026.400634] env[68617]: WARNING nova.compute.manager [req-01135a6d-c05b-42c9-9a04-5652c760f5a8 req-827adb6a-72e5-47a5-aac3-5096a49ac98e service nova] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Received unexpected event network-vif-plugged-4ea83c91-ca66-4c65-ac32-5a088c5cb80d for instance with vm_state building and task_state spawning. [ 2026.400791] env[68617]: DEBUG nova.compute.manager [req-01135a6d-c05b-42c9-9a04-5652c760f5a8 req-827adb6a-72e5-47a5-aac3-5096a49ac98e service nova] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Received event network-changed-4ea83c91-ca66-4c65-ac32-5a088c5cb80d {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2026.400945] env[68617]: DEBUG nova.compute.manager [req-01135a6d-c05b-42c9-9a04-5652c760f5a8 req-827adb6a-72e5-47a5-aac3-5096a49ac98e service nova] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Refreshing instance network info cache due to event network-changed-4ea83c91-ca66-4c65-ac32-5a088c5cb80d. 
{{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2026.401145] env[68617]: DEBUG oslo_concurrency.lockutils [req-01135a6d-c05b-42c9-9a04-5652c760f5a8 req-827adb6a-72e5-47a5-aac3-5096a49ac98e service nova] Acquiring lock "refresh_cache-797b434e-a913-43dc-a1df-39fe82da1221" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2026.401279] env[68617]: DEBUG oslo_concurrency.lockutils [req-01135a6d-c05b-42c9-9a04-5652c760f5a8 req-827adb6a-72e5-47a5-aac3-5096a49ac98e service nova] Acquired lock "refresh_cache-797b434e-a913-43dc-a1df-39fe82da1221" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2026.401445] env[68617]: DEBUG nova.network.neutron [req-01135a6d-c05b-42c9-9a04-5652c760f5a8 req-827adb6a-72e5-47a5-aac3-5096a49ac98e service nova] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Refreshing network info cache for port 4ea83c91-ca66-4c65-ac32-5a088c5cb80d {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2026.528845] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470890, 'name': CreateVM_Task, 'duration_secs': 0.343672} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2026.529069] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2026.529702] env[68617]: DEBUG oslo_concurrency.lockutils [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2026.529904] env[68617]: DEBUG oslo_concurrency.lockutils [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2026.530246] env[68617]: DEBUG oslo_concurrency.lockutils [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2026.530497] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d0c3e6c5-032c-4310-98d6-79145f6e846b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2026.534795] env[68617]: DEBUG oslo_vmware.api [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Waiting for the task: (returnval){ [ 2026.534795] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]5291fe92-f87d-7571-ade1-62b6703b556d" [ 2026.534795] env[68617]: _type = "Task" [ 2026.534795] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2026.543245] env[68617]: DEBUG oslo_vmware.api [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]5291fe92-f87d-7571-ade1-62b6703b556d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2026.787323] env[68617]: DEBUG nova.network.neutron [req-01135a6d-c05b-42c9-9a04-5652c760f5a8 req-827adb6a-72e5-47a5-aac3-5096a49ac98e service nova] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Updated VIF entry in instance network info cache for port 4ea83c91-ca66-4c65-ac32-5a088c5cb80d. {{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2026.787685] env[68617]: DEBUG nova.network.neutron [req-01135a6d-c05b-42c9-9a04-5652c760f5a8 req-827adb6a-72e5-47a5-aac3-5096a49ac98e service nova] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Updating instance_info_cache with network_info: [{"id": "4ea83c91-ca66-4c65-ac32-5a088c5cb80d", "address": "fa:16:3e:a0:a5:2f", "network": {"id": "e6650a9f-f26d-481d-8658-10ff40328891", "bridge": "br-int", "label": "tempest-ServersTestJSON-1149134727-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "162ecdbf203345a5b63167459e388608", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "aa09e855-8af1-419b-b78d-8ffcc94b1bfb", "external-id": "nsx-vlan-transportzone-901", "segmentation_id": 901, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4ea83c91-ca", "ovs_interfaceid": "4ea83c91-ca66-4c65-ac32-5a088c5cb80d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2026.796515] env[68617]: DEBUG oslo_concurrency.lockutils [req-01135a6d-c05b-42c9-9a04-5652c760f5a8 req-827adb6a-72e5-47a5-aac3-5096a49ac98e service nova] Releasing lock "refresh_cache-797b434e-a913-43dc-a1df-39fe82da1221" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2027.045101] env[68617]: DEBUG oslo_concurrency.lockutils [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2027.045363] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2027.045579] 
env[68617]: DEBUG oslo_concurrency.lockutils [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2036.699547] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2036.699933] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2036.699933] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2036.721905] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.722105] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.722192] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.722316] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.722438] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.722559] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.722681] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Skipping network cache update for instance because it is Building. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.722802] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.722923] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.723054] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.723180] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2040.699139] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2040.699558] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2041.699608] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2041.700041] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2041.700041] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2043.701613] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2044.698694] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2044.712072] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2044.712397] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2044.712508] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2044.712622] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2044.713689] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-413d3632-be95-421e-bfe0-1bfa75e63396 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2044.722538] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab4d5f8a-3499-4905-98eb-13678bf1a863 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2044.736190] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6c747d2-a94f-4e1c-ac5e-ac7ba709e236 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2044.742203] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f336e74-1484-4e42-b9ed-523281031368 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2044.771854] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180901MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2044.772030] env[68617]: DEBUG 
oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2044.772197] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2044.843622] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 2c950cba-7698-48e0-8852-bf569f58f967 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.843786] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 12ed2a40-3d74-49a2-95b4-ccaaf58c8060 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.843916] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 21d0560a-fde3-4c16-b2fc-06d6f8668a7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.844056] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 902b5ab9-23b8-450f-853a-b2da889c3afd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.844180] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 922c8926-c636-4463-85d6-4f2a6325b85a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.844300] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b1a8dc60-af98-4f80-96cf-b2550ea8c13a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.844417] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a4ab788d-327a-47cc-8ae7-e1b9be889759 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.844535] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 17bb8415-dafd-47ed-9a14-52163ba5e7db actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.844651] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 82f72313-f493-4acd-a95e-765feb74a358 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.844766] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 797b434e-a913-43dc-a1df-39fe82da1221 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.844954] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2044.845109] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2044.952505] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d688491-9f49-417a-9dbb-f3872162ace8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2044.960498] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ec404c2-d17e-4ffa-bc01-ff7f9a864fe1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2044.988808] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1caa4ec9-569f-48fc-ba67-1f5eaf557854 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2044.995435] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9761524c-ae33-449d-b02a-2d328fe426ac {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2045.007834] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2045.015838] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 
5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2045.029061] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2045.029302] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.257s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2046.030120] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2047.694697] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2070.104045] env[68617]: WARNING oslo_vmware.rw_handles [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2070.104045] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2070.104045] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2070.104045] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2070.104045] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2070.104045] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 2070.104045] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2070.104045] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2070.104045] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2070.104045] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2070.104045] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2070.104045] env[68617]: ERROR oslo_vmware.rw_handles [ 2070.104775] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Downloaded image file data 
c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/0e656d53-7992-4d58-b869-6df0776ae482/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2070.106495] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2070.106739] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Copying Virtual Disk [datastore2] vmware_temp/0e656d53-7992-4d58-b869-6df0776ae482/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/0e656d53-7992-4d58-b869-6df0776ae482/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2070.107029] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4ea04b3a-75f0-4751-a6c2-d8bdf9fb3c99 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2070.114437] env[68617]: DEBUG oslo_vmware.api [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Waiting for the task: (returnval){ [ 2070.114437] env[68617]: value = "task-3470891" [ 2070.114437] env[68617]: _type = "Task" [ 2070.114437] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2070.121946] env[68617]: DEBUG oslo_vmware.api [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Task: {'id': task-3470891, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2070.624328] env[68617]: DEBUG oslo_vmware.exceptions [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Fault InvalidArgument not matched. 
{{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2070.624602] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2070.625166] env[68617]: ERROR nova.compute.manager [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2070.625166] env[68617]: Faults: ['InvalidArgument'] [ 2070.625166] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Traceback (most recent call last): [ 2070.625166] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2070.625166] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] yield resources [ 2070.625166] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2070.625166] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] self.driver.spawn(context, instance, image_meta, [ 2070.625166] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2070.625166] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2070.625166] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2070.625166] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] self._fetch_image_if_missing(context, vi) [ 2070.625166] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2070.625449] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] image_cache(vi, tmp_image_ds_loc) [ 2070.625449] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2070.625449] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] vm_util.copy_virtual_disk( [ 2070.625449] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2070.625449] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] session._wait_for_task(vmdk_copy_task) [ 2070.625449] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 
157, in _wait_for_task [ 2070.625449] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] return self.wait_for_task(task_ref) [ 2070.625449] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2070.625449] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] return evt.wait() [ 2070.625449] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2070.625449] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] result = hub.switch() [ 2070.625449] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2070.625449] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] return self.greenlet.switch() [ 2070.625715] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2070.625715] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] self.f(*self.args, **self.kw) [ 2070.625715] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2070.625715] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] raise exceptions.translate_fault(task_info.error) [ 2070.625715] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2070.625715] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Faults: ['InvalidArgument'] [ 2070.625715] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] [ 2070.625715] env[68617]: INFO nova.compute.manager [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Terminating instance [ 2070.627072] env[68617]: DEBUG oslo_concurrency.lockutils [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2070.627305] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2070.627548] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c20f6066-b119-4a7f-8231-5f19928962c8 {{(pid=68617) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2070.629846] env[68617]: DEBUG nova.compute.manager [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2070.630046] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2070.630780] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cfe7593-7e5b-4c8f-bf24-21f3856473b4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2070.637339] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2070.637548] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b8759af7-16e8-4e0f-9691-7f5af8e6f52e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2070.639817] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2070.639985] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2070.640940] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-27c48962-21a9-47c3-9889-61468a38c707 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2070.645357] env[68617]: DEBUG oslo_vmware.api [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Waiting for the task: (returnval){ [ 2070.645357] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52e5449f-689f-ecc3-103e-a8bf019e5bf3" [ 2070.645357] env[68617]: _type = "Task" [ 2070.645357] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2070.653337] env[68617]: DEBUG oslo_vmware.api [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52e5449f-689f-ecc3-103e-a8bf019e5bf3, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2070.704055] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2070.704306] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2070.704483] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Deleting the datastore file [datastore2] 2c950cba-7698-48e0-8852-bf569f58f967 {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2070.704810] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-adec5aa0-a9bd-4559-a904-4b919fa96bea {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2070.711164] env[68617]: DEBUG oslo_vmware.api [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Waiting for the task: (returnval){ [ 2070.711164] env[68617]: value = "task-3470893" [ 2070.711164] env[68617]: _type = "Task" [ 2070.711164] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2070.718497] env[68617]: DEBUG oslo_vmware.api [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Task: {'id': task-3470893, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2071.155021] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2071.155442] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Creating directory with path [datastore2] vmware_temp/199cd782-0cf4-43b1-8923-6006dc07a64d/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2071.155680] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7c7b1c50-ae9b-4fca-81fb-62bdea1a4a4f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2071.166401] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Created directory with path [datastore2] vmware_temp/199cd782-0cf4-43b1-8923-6006dc07a64d/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2071.166585] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Fetch image to [datastore2] vmware_temp/199cd782-0cf4-43b1-8923-6006dc07a64d/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2071.166752] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/199cd782-0cf4-43b1-8923-6006dc07a64d/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2071.167466] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ad033e4-266a-4f2e-a1f4-1bd5646280ae {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2071.173693] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b7d0d7c-1f7e-4163-bb35-45d05c50404d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2071.182486] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2cd513f2-ca1b-4ef7-a878-c6b6d0f76c00 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2071.216541] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e2295c9-3020-4184-aa43-387861b21701 {{(pid=68617) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2071.222900] env[68617]: DEBUG oslo_vmware.api [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Task: {'id': task-3470893, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074287} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2071.224257] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2071.224442] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2071.224611] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2071.224780] env[68617]: INFO nova.compute.manager [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Took 0.59 seconds to destroy the instance on the hypervisor. 
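Note on the task polling seen above: the repeated "Task: {'id': ..., 'name': ...} progress is 0%" and "completed successfully" entries come from oslo.vmware's wait_for_task()/_poll_task loop, which re-reads the vCenter task's info on an interval until the task reaches a terminal state, returning the info on success and raising the translated fault on error (the path that produced the InvalidArgument traceback for task-3470891 above). A minimal, self-contained sketch of that pattern follows; FakeTask and its state strings are invented stand-ins for vCenter's TaskInfo object, not oslo.vmware's real API.

    import time

    class FakeTask:
        """Invented stand-in for a vCenter task; finishes after a few polls."""
        def __init__(self, task_id, polls_until_done=3, error=None):
            self.id = task_id
            self._remaining = polls_until_done
            self.error = error

        def info(self):
            # Each poll re-reads the task's state until it turns terminal.
            self._remaining -= 1
            if self._remaining > 0:
                return {'state': 'running', 'progress': 50}
            return {'state': 'error' if self.error else 'success',
                    'error': self.error}

    def wait_for_task(task, interval=0.1):
        """Poll until the task reaches a terminal state."""
        while True:
            info = task.info()
            if info['state'] == 'success':
                return info
            if info['state'] == 'error':
                # oslo.vmware raises the translated fault at this point,
                # which is how the VimFaultException above reached nova.
                raise RuntimeError(info['error'])
            time.sleep(interval)

    wait_for_task(FakeTask('task-3470893'))  # completes, like the delete task
    try:
        wait_for_task(FakeTask('task-3470891', error='InvalidArgument'))
    except RuntimeError as exc:
        print('fault:', exc)  # the failure path seen in the traceback above

Polling rather than blocking matters here because the whole driver runs on eventlet greenthreads: each sleep yields the hub, which is how periodic tasks interleave with task waits throughout this log.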
[ 2071.226606] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-662cceb0-1d2b-44c1-9082-e6e40b498819 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2071.228447] env[68617]: DEBUG nova.compute.claims [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2071.228622] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2071.228830] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2071.250029] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2071.338152] env[68617]: DEBUG oslo_vmware.rw_handles [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/199cd782-0cf4-43b1-8923-6006dc07a64d/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2071.397198] env[68617]: DEBUG oslo_vmware.rw_handles [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2071.397412] env[68617]: DEBUG oslo_vmware.rw_handles [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/199cd782-0cf4-43b1-8923-6006dc07a64d/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2071.436666] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3dcc0ed1-08d0-4997-b5f1-7aff11b9e14c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2071.444823] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acf7c508-33ec-4740-8303-cb1ca306c8c2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2071.473135] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e92f1ccb-ad0b-4e99-bc7a-78d42c864a75 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2071.479981] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2a636de-0461-4206-abc6-8829aa46d3e8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2071.492409] env[68617]: DEBUG nova.compute.provider_tree [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2071.501146] env[68617]: DEBUG nova.scheduler.client.report [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2071.514714] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.286s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2071.515249] env[68617]: ERROR nova.compute.manager [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2071.515249] env[68617]: Faults: ['InvalidArgument'] [ 2071.515249] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Traceback (most recent call last): [ 2071.515249] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2071.515249] env[68617]: ERROR nova.compute.manager 
[instance: 2c950cba-7698-48e0-8852-bf569f58f967] self.driver.spawn(context, instance, image_meta, [ 2071.515249] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2071.515249] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2071.515249] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2071.515249] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] self._fetch_image_if_missing(context, vi) [ 2071.515249] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2071.515249] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] image_cache(vi, tmp_image_ds_loc) [ 2071.515249] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2071.515633] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] vm_util.copy_virtual_disk( [ 2071.515633] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2071.515633] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] session._wait_for_task(vmdk_copy_task) [ 2071.515633] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2071.515633] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] return self.wait_for_task(task_ref) [ 2071.515633] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2071.515633] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] return evt.wait() [ 2071.515633] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2071.515633] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] result = hub.switch() [ 2071.515633] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2071.515633] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] return self.greenlet.switch() [ 2071.515633] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2071.515633] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] self.f(*self.args, **self.kw) [ 2071.515978] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2071.515978] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] raise exceptions.translate_fault(task_info.error) [ 2071.515978] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2071.515978] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Faults: ['InvalidArgument'] [ 2071.515978] env[68617]: ERROR nova.compute.manager [instance: 2c950cba-7698-48e0-8852-bf569f58f967] [ 2071.515978] env[68617]: DEBUG nova.compute.utils [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2071.517685] env[68617]: DEBUG nova.compute.manager [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Build of instance 2c950cba-7698-48e0-8852-bf569f58f967 was re-scheduled: A specified parameter was not correct: fileType [ 2071.517685] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2071.518088] env[68617]: DEBUG nova.compute.manager [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2071.518304] env[68617]: DEBUG nova.compute.manager [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2071.518512] env[68617]: DEBUG nova.compute.manager [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2071.518690] env[68617]: DEBUG nova.network.neutron [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2071.808917] env[68617]: DEBUG nova.network.neutron [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2071.818891] env[68617]: INFO nova.compute.manager [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Took 0.30 seconds to deallocate network for instance. [ 2071.908790] env[68617]: INFO nova.scheduler.client.report [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Deleted allocations for instance 2c950cba-7698-48e0-8852-bf569f58f967 [ 2071.937269] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6e96ec8d-6cb6-40b1-8ad2-bab505c56453 tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Lock "2c950cba-7698-48e0-8852-bf569f58f967" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 529.697s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2071.937532] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1f230113-be24-433b-be89-74b3d5b0776e tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Lock "2c950cba-7698-48e0-8852-bf569f58f967" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 333.938s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2071.937751] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1f230113-be24-433b-be89-74b3d5b0776e tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Acquiring lock "2c950cba-7698-48e0-8852-bf569f58f967-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2071.938070] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1f230113-be24-433b-be89-74b3d5b0776e tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Lock "2c950cba-7698-48e0-8852-bf569f58f967-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68617) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2071.938295] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1f230113-be24-433b-be89-74b3d5b0776e tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Lock "2c950cba-7698-48e0-8852-bf569f58f967-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2071.940650] env[68617]: INFO nova.compute.manager [None req-1f230113-be24-433b-be89-74b3d5b0776e tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Terminating instance [ 2071.942570] env[68617]: DEBUG nova.compute.manager [None req-1f230113-be24-433b-be89-74b3d5b0776e tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2071.942782] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-1f230113-be24-433b-be89-74b3d5b0776e tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2071.943333] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-70cdcea9-f2ce-4b0f-9ad0-083950ecb81b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2071.952562] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3064c478-ea9d-4eac-9de5-8a340bbcc321 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2071.981976] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-1f230113-be24-433b-be89-74b3d5b0776e tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 2c950cba-7698-48e0-8852-bf569f58f967 could not be found. [ 2071.982200] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-1f230113-be24-433b-be89-74b3d5b0776e tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2071.982386] env[68617]: INFO nova.compute.manager [None req-1f230113-be24-433b-be89-74b3d5b0776e tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2071.982630] env[68617]: DEBUG oslo.service.loopingcall [None req-1f230113-be24-433b-be89-74b3d5b0776e tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2071.982863] env[68617]: DEBUG nova.compute.manager [-] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2071.982962] env[68617]: DEBUG nova.network.neutron [-] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2072.015536] env[68617]: DEBUG nova.network.neutron [-] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2072.024272] env[68617]: INFO nova.compute.manager [-] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] Took 0.04 seconds to deallocate network for instance. [ 2072.117935] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1f230113-be24-433b-be89-74b3d5b0776e tempest-ServerRescueTestJSON-39379223 tempest-ServerRescueTestJSON-39379223-project-member] Lock "2c950cba-7698-48e0-8852-bf569f58f967" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.179s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2072.117935] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "2c950cba-7698-48e0-8852-bf569f58f967" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 168.468s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2072.117935] env[68617]: INFO nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 2c950cba-7698-48e0-8852-bf569f58f967] During sync_power_state the instance has a pending task (deleting). Skip.
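Note on the lock bookkeeping that dominates this section: every 'Acquiring lock ... by ...', 'acquired ... waited Ns' and '"released" ... held Ns' triple is emitted by oslo.concurrency's lockutils wrapper, which times how long the caller waited for the named lock and how long it held it, and identifies the caller by its fully qualified name; nested helpers such as do_terminate_instance therefore show up with '<locals>' in the path. A rough sketch of that pattern, illustrative only and not oslo.concurrency's actual implementation:

    import functools
    import threading
    import time

    _locks = {}

    def synchronized(name):
        """Log wait/held times around a named lock, lockutils-style."""
        lock = _locks.setdefault(name, threading.Lock())

        def decorator(fn):
            qual = '%s.%s' % (fn.__module__, fn.__qualname__)

            @functools.wraps(fn)
            def inner(*args, **kwargs):
                t0 = time.monotonic()
                with lock:
                    print('Lock "%s" acquired by "%s" :: waited %.3fs'
                          % (name, qual, time.monotonic() - t0))
                    t1 = time.monotonic()
                    try:
                        return fn(*args, **kwargs)
                    finally:
                        print('Lock "%s" "released" by "%s" :: held %.3fs'
                              % (name, qual, time.monotonic() - t1))
            return inner
        return decorator

    def terminate_instance():
        # Nested exactly like nova's helpers, so __qualname__ comes out as
        # "terminate_instance.<locals>.do_terminate_instance".
        @synchronized('2c950cba-7698-48e0-8852-bf569f58f967')
        def do_terminate_instance():
            time.sleep(0.01)
        do_terminate_instance()

    terminate_instance()

The long waited/held values in the surrounding entries (waited 333.938s, held 529.697s) are a direct consequence of this serialization: the terminate request queued on the per-instance lock for the entire time the failed build held it.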
[ 2072.117935] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "2c950cba-7698-48e0-8852-bf569f58f967" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2088.747442] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "6a1aa3fb-f182-4b9d-8add-7dfc70472be8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2088.747780] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "6a1aa3fb-f182-4b9d-8add-7dfc70472be8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2088.758201] env[68617]: DEBUG nova.compute.manager [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2088.803945] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2088.804206] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2088.805578] env[68617]: INFO nova.compute.claims [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2088.955334] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4e21e5d-4114-478f-bb1e-1ee345dc0931 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2088.962959] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1cc4e1b-9799-473a-b792-c72a7f5a7cd3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2088.991839] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with
opID=oslo.vmware-4985d153-6100-4717-b582-6f69c4c2383c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2088.998774] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9148499a-552b-4dc5-859c-46620e5ba7c6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2089.011663] env[68617]: DEBUG nova.compute.provider_tree [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2089.020975] env[68617]: DEBUG nova.scheduler.client.report [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2089.033494] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.229s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2089.033942] env[68617]: DEBUG nova.compute.manager [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2089.063740] env[68617]: DEBUG nova.compute.utils [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2089.065167] env[68617]: DEBUG nova.compute.manager [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Allocating IP information in the background. 
{{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2089.065285] env[68617]: DEBUG nova.network.neutron [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2089.074701] env[68617]: DEBUG nova.compute.manager [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2089.130328] env[68617]: DEBUG nova.policy [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'be1fb3906fa449949fc0b5eae9cab9fb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1e11c4e5c25a42119594647403c0199b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 2089.137213] env[68617]: DEBUG nova.compute.manager [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2089.162815] env[68617]: DEBUG nova.virt.hardware [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2089.163066] env[68617]: DEBUG nova.virt.hardware [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2089.163415] env[68617]: DEBUG nova.virt.hardware [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2089.163415] env[68617]: DEBUG nova.virt.hardware [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2089.163538] env[68617]: DEBUG nova.virt.hardware [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2089.163677] env[68617]: DEBUG nova.virt.hardware [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2089.163885] env[68617]: DEBUG nova.virt.hardware [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2089.164143] env[68617]: DEBUG nova.virt.hardware [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2089.164217] env[68617]: DEBUG 
nova.virt.hardware [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2089.164363] env[68617]: DEBUG nova.virt.hardware [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2089.164607] env[68617]: DEBUG nova.virt.hardware [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2089.165390] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-978c0795-46f7-41e6-989e-6e795f7c4799 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2089.173030] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d01b08b-5ef0-45d6-b0a4-1cb41ba45a75 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2089.431860] env[68617]: DEBUG nova.network.neutron [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Successfully created port: e8123e95-5045-4bc6-9f18-e243a2cee984 {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2089.951168] env[68617]: DEBUG nova.compute.manager [req-f35e6fd5-4e66-4396-a3b6-a8175bc164a5 req-10eff581-3774-4736-8ebc-26bc3e4dbe13 service nova] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Received event network-vif-plugged-e8123e95-5045-4bc6-9f18-e243a2cee984 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2089.951168] env[68617]: DEBUG oslo_concurrency.lockutils [req-f35e6fd5-4e66-4396-a3b6-a8175bc164a5 req-10eff581-3774-4736-8ebc-26bc3e4dbe13 service nova] Acquiring lock "6a1aa3fb-f182-4b9d-8add-7dfc70472be8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2089.951168] env[68617]: DEBUG oslo_concurrency.lockutils [req-f35e6fd5-4e66-4396-a3b6-a8175bc164a5 req-10eff581-3774-4736-8ebc-26bc3e4dbe13 service nova] Lock "6a1aa3fb-f182-4b9d-8add-7dfc70472be8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2089.951168] env[68617]: DEBUG oslo_concurrency.lockutils [req-f35e6fd5-4e66-4396-a3b6-a8175bc164a5 req-10eff581-3774-4736-8ebc-26bc3e4dbe13 service nova] Lock "6a1aa3fb-f182-4b9d-8add-7dfc70472be8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2089.951520] env[68617]: DEBUG
nova.compute.manager [req-f35e6fd5-4e66-4396-a3b6-a8175bc164a5 req-10eff581-3774-4736-8ebc-26bc3e4dbe13 service nova] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] No waiting events found dispatching network-vif-plugged-e8123e95-5045-4bc6-9f18-e243a2cee984 {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2089.951520] env[68617]: WARNING nova.compute.manager [req-f35e6fd5-4e66-4396-a3b6-a8175bc164a5 req-10eff581-3774-4736-8ebc-26bc3e4dbe13 service nova] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Received unexpected event network-vif-plugged-e8123e95-5045-4bc6-9f18-e243a2cee984 for instance with vm_state building and task_state spawning. [ 2090.026602] env[68617]: DEBUG nova.network.neutron [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Successfully updated port: e8123e95-5045-4bc6-9f18-e243a2cee984 {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2090.036039] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "refresh_cache-6a1aa3fb-f182-4b9d-8add-7dfc70472be8" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2090.036191] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquired lock "refresh_cache-6a1aa3fb-f182-4b9d-8add-7dfc70472be8" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2090.036341] env[68617]: DEBUG nova.network.neutron [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2090.075603] env[68617]: DEBUG nova.network.neutron [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Instance cache missing network info. 
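The nova.virt.hardware entries above trace CPU topology selection: with no flavor or image limits set, each axis maximum defaults to 65536, and for the 1-vCPU m1.nano flavor the only factorization is sockets=1, cores=1, threads=1, hence "Got 1 possible topologies". A minimal sketch of that enumeration step (a simplification for illustration, not nova's actual implementation, which also weighs NUMA constraints and explicit preferences):

    from collections import namedtuple

    VirtCPUTopology = namedtuple("VirtCPUTopology", "sockets cores threads")

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        # Every (sockets, cores, threads) split whose product equals the
        # flavor's vCPU count and stays within the per-axis maxima.
        return [VirtCPUTopology(s, c, t)
                for s in range(1, min(vcpus, max_sockets) + 1)
                for c in range(1, min(vcpus, max_cores) + 1)
                for t in range(1, min(vcpus, max_threads) + 1)
                if s * c * t == vcpus]

    # For the 1-vCPU flavor in the log this yields exactly one candidate.
    print(possible_topologies(1))  # [VirtCPUTopology(sockets=1, cores=1, threads=1)]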
{{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2090.221829] env[68617]: DEBUG nova.network.neutron [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Updating instance_info_cache with network_info: [{"id": "e8123e95-5045-4bc6-9f18-e243a2cee984", "address": "fa:16:3e:ef:f7:ab", "network": {"id": "1d9c32bb-1c81-4af6-8d3f-365a52df11cd", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-313904480-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1e11c4e5c25a42119594647403c0199b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d62c1cf-f39a-4626-9552-f1e13c692636", "external-id": "nsx-vlan-transportzone-748", "segmentation_id": 748, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape8123e95-50", "ovs_interfaceid": "e8123e95-5045-4bc6-9f18-e243a2cee984", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2090.233600] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Releasing lock "refresh_cache-6a1aa3fb-f182-4b9d-8add-7dfc70472be8" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2090.233864] env[68617]: DEBUG nova.compute.manager [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Instance network_info: |[{"id": "e8123e95-5045-4bc6-9f18-e243a2cee984", "address": "fa:16:3e:ef:f7:ab", "network": {"id": "1d9c32bb-1c81-4af6-8d3f-365a52df11cd", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-313904480-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1e11c4e5c25a42119594647403c0199b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d62c1cf-f39a-4626-9552-f1e13c692636", "external-id": "nsx-vlan-transportzone-748", "segmentation_id": 748, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape8123e95-50", "ovs_interfaceid": "e8123e95-5045-4bc6-9f18-e243a2cee984", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 2090.234244] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ef:f7:ab', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6d62c1cf-f39a-4626-9552-f1e13c692636', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e8123e95-5045-4bc6-9f18-e243a2cee984', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2090.241822] env[68617]: DEBUG oslo.service.loopingcall [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2090.242247] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2090.242469] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8d644f2c-a8c3-450c-aafc-2d45c77b9636 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2090.261765] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2090.261765] env[68617]: value = "task-3470894" [ 2090.261765] env[68617]: _type = "Task" [ 2090.261765] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2090.269299] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470894, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2090.772415] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470894, 'name': CreateVM_Task, 'duration_secs': 0.273347} completed successfully. 
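The CreateVM_Task exchange just above follows oslo.vmware's invoke-then-poll pattern: the SOAP call returns a task reference immediately ("progress is 0%"), and a polling loop watches the task until it reaches a terminal state ("completed successfully" with a duration_secs). A simplified, self-contained sketch of such a loop (the TaskInfo stand-in and the poll callable are assumptions; the real wait_for_task runs inside a looping call and raises a translated fault on error):

    import time
    from dataclasses import dataclass

    @dataclass
    class TaskInfo:              # stand-in for the vSphere TaskInfo object
        state: str               # "queued" | "running" | "success" | "error"
        result: object = None
        error: str = ""

    def wait_for_task(poll, interval=0.5):
        """Poll until the task leaves the queued/running states."""
        while True:
            info = poll()
            if info.state == "success":
                return info.result
            if info.state == "error":
                # oslo.vmware translates the fault at this point; see the
                # "Fault InvalidArgument not matched" entry further down.
                raise RuntimeError(info.error)
            time.sleep(interval)

    # Toy usage: a task that succeeds on the second poll.
    states = iter([TaskInfo("running"), TaskInfo("success", result="vm-123")])
    print(wait_for_task(lambda: next(states), interval=0))  # vm-123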
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2090.772552] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2090.773295] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2090.773463] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2090.773779] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2090.774065] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9d8446d6-31d5-4a61-9577-0dddda83c332 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2090.778446] env[68617]: DEBUG oslo_vmware.api [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Waiting for the task: (returnval){ [ 2090.778446] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]5224c195-0b97-8d62-83f2-61c8c48775c2" [ 2090.778446] env[68617]: _type = "Task" [ 2090.778446] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2090.785906] env[68617]: DEBUG oslo_vmware.api [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]5224c195-0b97-8d62-83f2-61c8c48775c2, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2091.288830] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2091.289117] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2091.289383] env[68617]: DEBUG oslo_concurrency.lockutils [None req-6daaac69-5952-46ee-ba87-f4c389afde99 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2091.981549] env[68617]: DEBUG nova.compute.manager [req-db47a88c-f053-46f8-9898-8d6d81e9281b req-1beef7c6-5c77-4695-b74e-c7ed51cde8f8 service nova] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Received event network-changed-e8123e95-5045-4bc6-9f18-e243a2cee984 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2091.981762] env[68617]: DEBUG nova.compute.manager [req-db47a88c-f053-46f8-9898-8d6d81e9281b req-1beef7c6-5c77-4695-b74e-c7ed51cde8f8 service nova] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Refreshing instance network info cache due to event network-changed-e8123e95-5045-4bc6-9f18-e243a2cee984. {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2091.981981] env[68617]: DEBUG oslo_concurrency.lockutils [req-db47a88c-f053-46f8-9898-8d6d81e9281b req-1beef7c6-5c77-4695-b74e-c7ed51cde8f8 service nova] Acquiring lock "refresh_cache-6a1aa3fb-f182-4b9d-8add-7dfc70472be8" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2091.982145] env[68617]: DEBUG oslo_concurrency.lockutils [req-db47a88c-f053-46f8-9898-8d6d81e9281b req-1beef7c6-5c77-4695-b74e-c7ed51cde8f8 service nova] Acquired lock "refresh_cache-6a1aa3fb-f182-4b9d-8add-7dfc70472be8" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2091.982310] env[68617]: DEBUG nova.network.neutron [req-db47a88c-f053-46f8-9898-8d6d81e9281b req-1beef7c6-5c77-4695-b74e-c7ed51cde8f8 service nova] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Refreshing network info cache for port e8123e95-5045-4bc6-9f18-e243a2cee984 {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2092.268547] env[68617]: DEBUG nova.network.neutron [req-db47a88c-f053-46f8-9898-8d6d81e9281b req-1beef7c6-5c77-4695-b74e-c7ed51cde8f8 service nova] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Updated VIF entry in instance network info cache for port e8123e95-5045-4bc6-9f18-e243a2cee984. 
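The lock names above, "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453", show how the VMware driver serializes work on one cached image: the lock is taken before SearchDatastore_Task checks whether the cached VMDK exists and released once the decision is made, so concurrent spawns of the same image cannot race each other into duplicate downloads. A rough sketch of that check-then-fetch guard (assuming oslo.concurrency is available; per the log, nova additionally takes an external semaphore to coordinate across processes on the host):

    from oslo_concurrency import lockutils

    _cache = set()  # stands in for the files under devstack-image-cache_base

    def fetch_if_missing(datastore, image_id, download):
        # One lock per cached image path, named like the log entries above.
        lock_name = f"[{datastore}] devstack-image-cache_base/{image_id}"
        with lockutils.lock(lock_name):
            if image_id not in _cache:      # SearchDatastore_Task in the log
                download(image_id)          # only the first caller downloads
                _cache.add(image_id)

    fetch_if_missing("datastore2", "c87eab51-bc9a-44dc-8f0d-7ab73283e453",
                     lambda image_id: None)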
{{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2092.268887] env[68617]: DEBUG nova.network.neutron [req-db47a88c-f053-46f8-9898-8d6d81e9281b req-1beef7c6-5c77-4695-b74e-c7ed51cde8f8 service nova] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Updating instance_info_cache with network_info: [{"id": "e8123e95-5045-4bc6-9f18-e243a2cee984", "address": "fa:16:3e:ef:f7:ab", "network": {"id": "1d9c32bb-1c81-4af6-8d3f-365a52df11cd", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-313904480-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1e11c4e5c25a42119594647403c0199b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d62c1cf-f39a-4626-9552-f1e13c692636", "external-id": "nsx-vlan-transportzone-748", "segmentation_id": 748, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape8123e95-50", "ovs_interfaceid": "e8123e95-5045-4bc6-9f18-e243a2cee984", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2092.279415] env[68617]: DEBUG oslo_concurrency.lockutils [req-db47a88c-f053-46f8-9898-8d6d81e9281b req-1beef7c6-5c77-4695-b74e-c7ed51cde8f8 service nova] Releasing lock "refresh_cache-6a1aa3fb-f182-4b9d-8add-7dfc70472be8" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2098.694455] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2098.717320] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2098.717512] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2098.717630] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2098.737574] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Skipping network cache update for instance because it is Building. 
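The instance_info_cache payload repeated above is a plain list of VIF dicts. For reference, pulling the useful fields (port ID, MAC, fixed IPs, MTU) back out of that structure takes only a couple of comprehensions; the snippet below keeps just the keys it reads:

    network_info = [{
        "id": "e8123e95-5045-4bc6-9f18-e243a2cee984",
        "address": "fa:16:3e:ef:f7:ab",
        "network": {
            "subnets": [{
                "cidr": "192.168.128.0/28",
                "ips": [{"address": "192.168.128.6", "type": "fixed"}],
            }],
            "meta": {"mtu": 8950},
        },
    }]

    for vif in network_info:
        fixed_ips = [ip["address"]
                     for subnet in vif["network"]["subnets"]
                     for ip in subnet["ips"] if ip["type"] == "fixed"]
        # Prints the port id, the MAC, ['192.168.128.6'] and 8950.
        print(vif["id"], vif["address"], fixed_ips,
              vif["network"]["meta"]["mtu"])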
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2098.737733] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2098.737867] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2098.737993] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2098.738135] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2098.738255] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2098.738375] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2098.738494] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2098.738612] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2098.738733] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2098.738892] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. 
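_heal_instance_info_cache rebuilds its candidate list every period and passes over any instance still being built, which is why each of the ten instances above logs "Skipping network cache update" and the run ends with nothing to refresh. Schematically (a simplification; the real periodic task also rotates through instances rather than scanning them all at once):

    BUILDING = "building"

    def instances_to_heal(instances):
        # Only instances past the build phase get their cache refreshed.
        for inst in instances:
            if inst["vm_state"] == BUILDING:
                continue  # "Skipping network cache update ... Building"
            yield inst

    insts = [{"uuid": "6a1aa3fb-f182-4b9d-8add-7dfc70472be8",
              "vm_state": BUILDING}]
    # "Didn't find any instances for network info cache update."
    assert list(instances_to_heal(insts)) == []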
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2100.698982] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2102.700048] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2102.700288] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2102.700349] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2102.700494] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2104.701703] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2105.699453] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2105.711431] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2105.711747] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2105.711853] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2105.711948] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2105.713482] env[68617]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7db66dc-3f68-4f15-ac78-23969140cfd5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2105.721834] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9382bb8-cfbc-4418-83e9-3a18c5c8bcf1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2105.735630] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37b16811-68ab-4077-bb0b-ff2d656bc949 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2105.742046] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29f2a54e-22d2-46c4-b52e-c1336e2cc018 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2105.770674] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180909MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2105.770908] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2105.771048] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2105.845122] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 12ed2a40-3d74-49a2-95b4-ccaaf58c8060 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2105.845305] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 21d0560a-fde3-4c16-b2fc-06d6f8668a7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2105.845438] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 902b5ab9-23b8-450f-853a-b2da889c3afd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2105.845563] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 922c8926-c636-4463-85d6-4f2a6325b85a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2105.845805] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b1a8dc60-af98-4f80-96cf-b2550ea8c13a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2105.845805] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a4ab788d-327a-47cc-8ae7-e1b9be889759 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2105.845913] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 17bb8415-dafd-47ed-9a14-52163ba5e7db actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2105.846050] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 82f72313-f493-4acd-a95e-765feb74a358 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2105.846142] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 797b434e-a913-43dc-a1df-39fe82da1221 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2105.846259] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6a1aa3fb-f182-4b9d-8add-7dfc70472be8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
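The "Final resource view" that follows is straightforward bookkeeping over these ten allocations: each m1.nano instance holds 128 MB of RAM, 1 GB of root disk and 1 vCPU, and the inventory reserves 512 MB for the host itself. Checking the arithmetic against the used_ram=1792MB, used_disk=10GB and used_vcpus=10 reported below:

    instances = 10           # the ten "actively managed" allocations above
    mem_mb_each = 128        # MEMORY_MB per allocation
    disk_gb_each = 1         # DISK_GB per allocation
    reserved_host_mb = 512   # 'reserved' in the MEMORY_MB inventory below

    print(reserved_host_mb + instances * mem_mb_each)  # 1792 -> used_ram in MB
    print(instances * disk_gb_each)                    # 10   -> used_disk in GB
    print(instances * 1)                               # 10   -> used_vcpus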
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2105.846451] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2105.846590] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2105.964023] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79b357a0-0ff8-4208-ac08-3b2f47921ec5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2105.971612] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6dff9272-3e17-461a-889b-dda99b2de38a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2106.000382] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1e57282-5b3a-47e1-a98b-35b3ad8542e2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2106.009243] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e5b0031-c041-4be5-95c2-8c5b47cf325e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2106.019838] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2106.028178] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2106.045751] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2106.045947] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.275s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2108.041899] env[68617]: DEBUG oslo_service.periodic_task [None 
req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2108.042202] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2115.825767] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d97fd8ce-beab-46d3-890b-7260e3649488 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Acquiring lock "17bb8415-dafd-47ed-9a14-52163ba5e7db" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2119.239683] env[68617]: WARNING oslo_vmware.rw_handles [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2119.239683] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2119.239683] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2119.239683] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2119.239683] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2119.239683] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 2119.239683] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2119.239683] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2119.239683] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2119.239683] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2119.239683] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2119.239683] env[68617]: ERROR oslo_vmware.rw_handles [ 2119.240437] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/199cd782-0cf4-43b1-8923-6006dc07a64d/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2119.242269] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2119.242508] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Copying
Virtual Disk [datastore2] vmware_temp/199cd782-0cf4-43b1-8923-6006dc07a64d/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/199cd782-0cf4-43b1-8923-6006dc07a64d/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2119.242860] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-daabc4b2-9a88-4599-8da7-502195ebf8ee {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2119.251013] env[68617]: DEBUG oslo_vmware.api [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Waiting for the task: (returnval){ [ 2119.251013] env[68617]: value = "task-3470895" [ 2119.251013] env[68617]: _type = "Task" [ 2119.251013] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2119.259050] env[68617]: DEBUG oslo_vmware.api [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Task: {'id': task-3470895, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2119.761777] env[68617]: DEBUG oslo_vmware.exceptions [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Fault InvalidArgument not matched. {{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2119.762132] env[68617]: DEBUG oslo_concurrency.lockutils [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2119.762771] env[68617]: ERROR nova.compute.manager [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2119.762771] env[68617]: Faults: ['InvalidArgument'] [ 2119.762771] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Traceback (most recent call last): [ 2119.762771] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2119.762771] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] yield resources [ 2119.762771] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2119.762771] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] self.driver.spawn(context, instance, image_meta, [ 2119.762771] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", 
line 539, in spawn [ 2119.762771] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2119.762771] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2119.762771] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] self._fetch_image_if_missing(context, vi) [ 2119.762771] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2119.763272] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] image_cache(vi, tmp_image_ds_loc) [ 2119.763272] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2119.763272] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] vm_util.copy_virtual_disk( [ 2119.763272] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2119.763272] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] session._wait_for_task(vmdk_copy_task) [ 2119.763272] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2119.763272] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] return self.wait_for_task(task_ref) [ 2119.763272] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2119.763272] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] return evt.wait() [ 2119.763272] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2119.763272] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] result = hub.switch() [ 2119.763272] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2119.763272] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] return self.greenlet.switch() [ 2119.763788] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2119.763788] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] self.f(*self.args, **self.kw) [ 2119.763788] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2119.763788] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] raise exceptions.translate_fault(task_info.error) [ 2119.763788] env[68617]: ERROR 
nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2119.763788] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Faults: ['InvalidArgument'] [ 2119.763788] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] [ 2119.763788] env[68617]: INFO nova.compute.manager [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Terminating instance [ 2119.764872] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2119.765136] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2119.765469] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7d88c4b2-7755-43e1-9a6e-bcdcd4ed1bba {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2119.767814] env[68617]: DEBUG nova.compute.manager [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Start destroying the instance on the hypervisor. 
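The "Fault InvalidArgument not matched" entry above is oslo.vmware's exception translation at work: the library looks the fault name up in a registry of dedicated exception classes and, finding none for InvalidArgument, falls back to the generic VimFaultException that then surfaces through wait_for_task in the traceback. A rough sketch of that dispatch (class names and the registry contents here are illustrative, not oslo.vmware's actual tables):

    class VimFaultException(Exception):
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list

    class FileNotFoundException(VimFaultException):
        """Example of a fault that does have a dedicated class."""

    _FAULT_CLASSES = {"FileNotFound": FileNotFoundException}

    def translate_fault(fault_name, message):
        # Faults without a dedicated class are "not matched" and fall
        # back to the generic exception, as the log entry shows.
        cls = _FAULT_CLASSES.get(fault_name, VimFaultException)
        return cls([fault_name], message)

    err = translate_fault("InvalidArgument",
                          "A specified parameter was not correct: fileType")
    print(type(err).__name__, err.fault_list)  # VimFaultException ['InvalidArgument']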
{{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2119.768049] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2119.768862] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b86905b6-3281-48ec-a528-9ece15b5200b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2119.776239] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2119.776471] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1b42f283-0f99-43a5-8424-ed24b15a543f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2119.778866] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2119.779083] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2119.780112] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6c8b322b-80ff-4ff4-9ae7-cfbdf8fa74f6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2119.784880] env[68617]: DEBUG oslo_vmware.api [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Waiting for the task: (returnval){ [ 2119.784880] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]525d0ee9-328c-5478-8830-42fc63f45452" [ 2119.784880] env[68617]: _type = "Task" [ 2119.784880] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2119.792816] env[68617]: DEBUG oslo_vmware.api [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]525d0ee9-328c-5478-8830-42fc63f45452, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2119.842741] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2119.842968] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2119.843173] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Deleting the datastore file [datastore2] 12ed2a40-3d74-49a2-95b4-ccaaf58c8060 {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2119.843444] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ea2baf32-e3d9-4b54-b2ce-ad324200d1a2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2119.850048] env[68617]: DEBUG oslo_vmware.api [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Waiting for the task: (returnval){ [ 2119.850048] env[68617]: value = "task-3470897" [ 2119.850048] env[68617]: _type = "Task" [ 2119.850048] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2119.857330] env[68617]: DEBUG oslo_vmware.api [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Task: {'id': task-3470897, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2120.295481] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2120.295755] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Creating directory with path [datastore2] vmware_temp/941d469a-bcfc-4493-a170-adb501972687/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2120.295929] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cb81feb2-b585-49a3-b12b-b4c619e8b091 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2120.308014] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Created directory with path [datastore2] vmware_temp/941d469a-bcfc-4493-a170-adb501972687/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2120.308213] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Fetch image to [datastore2] vmware_temp/941d469a-bcfc-4493-a170-adb501972687/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2120.308381] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/941d469a-bcfc-4493-a170-adb501972687/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2120.309085] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8b5b08e-5274-477c-a762-5568ce8217d1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2120.315459] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c50bc9d-4003-4e59-ba46-2905293a1b01 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2120.323907] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-916e0fac-efe9-4e2c-9b74-b31e9ac60340 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2120.357473] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-cf424dea-4812-438f-9105-808b9a5a0c55 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2120.364304] env[68617]: DEBUG oslo_vmware.api [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Task: {'id': task-3470897, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.08601} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2120.365754] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2120.365941] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2120.366125] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2120.366296] env[68617]: INFO nova.compute.manager [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 2120.367976] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5f1b1a6a-f0b9-4846-b21a-92cd921e4f6a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2120.369777] env[68617]: DEBUG nova.compute.claims [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2120.369958] env[68617]: DEBUG oslo_concurrency.lockutils [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2120.370181] env[68617]: DEBUG oslo_concurrency.lockutils [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2120.391056] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2120.441954] env[68617]: DEBUG oslo_vmware.rw_handles [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/941d469a-bcfc-4493-a170-adb501972687/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2120.501162] env[68617]: DEBUG oslo_vmware.rw_handles [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2120.501367] env[68617]: DEBUG oslo_vmware.rw_handles [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/941d469a-bcfc-4493-a170-adb501972687/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2120.578584] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e8c2c13-7e8a-4135-9407-e4eda79f384f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2120.586172] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d679245c-7f4f-4464-ad0d-f2fce9e28447 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2120.614769] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad80202c-845c-43ec-a936-fa6c654aff35 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2120.621491] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ded063e4-5142-494a-be0f-08c97ac7b616 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2120.634645] env[68617]: DEBUG nova.compute.provider_tree [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2120.643246] env[68617]: DEBUG nova.scheduler.client.report [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2120.657861] env[68617]: DEBUG oslo_concurrency.lockutils [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.288s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2120.658374] env[68617]: ERROR nova.compute.manager [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2120.658374] env[68617]: Faults: ['InvalidArgument'] [ 2120.658374] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Traceback (most recent call last): [ 2120.658374] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2120.658374] env[68617]: ERROR nova.compute.manager [instance: 
12ed2a40-3d74-49a2-95b4-ccaaf58c8060] self.driver.spawn(context, instance, image_meta, [ 2120.658374] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2120.658374] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2120.658374] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2120.658374] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] self._fetch_image_if_missing(context, vi) [ 2120.658374] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2120.658374] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] image_cache(vi, tmp_image_ds_loc) [ 2120.658374] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2120.658714] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] vm_util.copy_virtual_disk( [ 2120.658714] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2120.658714] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] session._wait_for_task(vmdk_copy_task) [ 2120.658714] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2120.658714] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] return self.wait_for_task(task_ref) [ 2120.658714] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2120.658714] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] return evt.wait() [ 2120.658714] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2120.658714] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] result = hub.switch() [ 2120.658714] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2120.658714] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] return self.greenlet.switch() [ 2120.658714] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2120.658714] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] self.f(*self.args, **self.kw) [ 2120.659062] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2120.659062] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] raise exceptions.translate_fault(task_info.error) [ 2120.659062] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2120.659062] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Faults: ['InvalidArgument'] [ 2120.659062] env[68617]: ERROR nova.compute.manager [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] [ 2120.659062] env[68617]: DEBUG nova.compute.utils [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2120.660462] env[68617]: DEBUG nova.compute.manager [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Build of instance 12ed2a40-3d74-49a2-95b4-ccaaf58c8060 was re-scheduled: A specified parameter was not correct: fileType [ 2120.660462] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2120.660823] env[68617]: DEBUG nova.compute.manager [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2120.661022] env[68617]: DEBUG nova.compute.manager [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2120.661201] env[68617]: DEBUG nova.compute.manager [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2120.661362] env[68617]: DEBUG nova.network.neutron [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2120.969202] env[68617]: DEBUG nova.network.neutron [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2120.979774] env[68617]: INFO nova.compute.manager [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Took 0.32 seconds to deallocate network for instance. [ 2121.076473] env[68617]: INFO nova.scheduler.client.report [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Deleted allocations for instance 12ed2a40-3d74-49a2-95b4-ccaaf58c8060 [ 2121.097023] env[68617]: DEBUG oslo_concurrency.lockutils [None req-928ebe49-1896-41d7-a753-0721cc7a7669 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Lock "12ed2a40-3d74-49a2-95b4-ccaaf58c8060" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 571.979s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2121.097185] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2c94f0c7-c213-4bc7-8690-5eb28181b87f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Lock "12ed2a40-3d74-49a2-95b4-ccaaf58c8060" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 375.872s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2121.097396] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2c94f0c7-c213-4bc7-8690-5eb28181b87f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquiring lock "12ed2a40-3d74-49a2-95b4-ccaaf58c8060-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2121.097604] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2c94f0c7-c213-4bc7-8690-5eb28181b87f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Lock "12ed2a40-3d74-49a2-95b4-ccaaf58c8060-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 
2121.097771] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2c94f0c7-c213-4bc7-8690-5eb28181b87f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Lock "12ed2a40-3d74-49a2-95b4-ccaaf58c8060-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2121.099741] env[68617]: INFO nova.compute.manager [None req-2c94f0c7-c213-4bc7-8690-5eb28181b87f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Terminating instance [ 2121.101467] env[68617]: DEBUG nova.compute.manager [None req-2c94f0c7-c213-4bc7-8690-5eb28181b87f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2121.101656] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2c94f0c7-c213-4bc7-8690-5eb28181b87f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2121.102139] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-cbcbd262-dcd2-4570-8c55-16be638dc153 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2121.111253] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbf46173-eefb-4e52-b3fd-b61e67d201e1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2121.139440] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-2c94f0c7-c213-4bc7-8690-5eb28181b87f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 12ed2a40-3d74-49a2-95b4-ccaaf58c8060 could not be found. [ 2121.139627] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-2c94f0c7-c213-4bc7-8690-5eb28181b87f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2121.139800] env[68617]: INFO nova.compute.manager [None req-2c94f0c7-c213-4bc7-8690-5eb28181b87f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2121.140078] env[68617]: DEBUG oslo.service.loopingcall [None req-2c94f0c7-c213-4bc7-8690-5eb28181b87f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2121.140314] env[68617]: DEBUG nova.compute.manager [-] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2121.140409] env[68617]: DEBUG nova.network.neutron [-] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2121.163031] env[68617]: DEBUG nova.network.neutron [-] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2121.170936] env[68617]: INFO nova.compute.manager [-] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] Took 0.03 seconds to deallocate network for instance. [ 2121.251498] env[68617]: DEBUG oslo_concurrency.lockutils [None req-2c94f0c7-c213-4bc7-8690-5eb28181b87f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Lock "12ed2a40-3d74-49a2-95b4-ccaaf58c8060" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.154s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2121.252292] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "12ed2a40-3d74-49a2-95b4-ccaaf58c8060" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 217.602s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2121.252472] env[68617]: INFO nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 12ed2a40-3d74-49a2-95b4-ccaaf58c8060] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 2121.252647] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "12ed2a40-3d74-49a2-95b4-ccaaf58c8060" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2127.710901] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0b8ea8c9-9b88-4e1e-98d8-6a2daeda0c01 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "82f72313-f493-4acd-a95e-765feb74a358" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2138.710944] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c97aa9c5-7dac-485f-b5ba-2e60ece60999 tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquiring lock "797b434e-a913-43dc-a1df-39fe82da1221" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2155.702213] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2157.707891] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2157.708214] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Cleaning up deleted instances {{(pid=68617) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 2157.717525] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] There are 0 instances to clean {{(pid=68617) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 2160.710691] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2160.710691] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2160.710691] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2160.729184] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Skipping network cache update for instance because it is Building. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2160.729342] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2160.729473] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2160.729596] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2160.729718] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2160.729835] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2160.729951] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2160.730383] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2160.730383] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2160.730558] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2161.699792] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2161.699973] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Cleaning up deleted instances with incomplete migration {{(pid=68617) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 2162.709435] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2162.709729] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2164.700075] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2164.700418] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2164.700457] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2165.700267] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2167.641117] env[68617]: WARNING oslo_vmware.rw_handles [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2167.641117] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2167.641117] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2167.641117] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2167.641117] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2167.641117] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 2167.641117] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2167.641117] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2167.641117] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2167.641117] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2167.641117] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2167.641117] env[68617]: ERROR oslo_vmware.rw_handles [ 2167.642114] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/941d469a-bcfc-4493-a170-adb501972687/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2167.643628] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2167.643870] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Copying Virtual Disk [datastore2] vmware_temp/941d469a-bcfc-4493-a170-adb501972687/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/941d469a-bcfc-4493-a170-adb501972687/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2167.644175] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-051ff90c-8034-493c-9a57-1b7d9345fb92 {{(pid=68617) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2167.652155] env[68617]: DEBUG oslo_vmware.api [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Waiting for the task: (returnval){ [ 2167.652155] env[68617]: value = "task-3470898" [ 2167.652155] env[68617]: _type = "Task" [ 2167.652155] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2167.660462] env[68617]: DEBUG oslo_vmware.api [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Task: {'id': task-3470898, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2167.694143] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2167.698809] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2167.729775] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2167.730034] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2167.730198] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2167.730350] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2167.731465] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-383ae57e-2c01-4284-938d-c7715bc00f0d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2167.739804] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36191e72-5ae4-49bf-803f-b2687bbe7a71 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2167.754642] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx 
with opID=oslo.vmware-44cdfd2c-d148-4fc2-883c-946733f6298b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2167.760963] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49c4fbbb-2386-491f-a9e5-af4e4ad220bc {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2167.791155] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180900MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2167.791273] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2167.791532] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2167.884961] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 21d0560a-fde3-4c16-b2fc-06d6f8668a7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2167.885145] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 902b5ab9-23b8-450f-853a-b2da889c3afd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2167.885274] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 922c8926-c636-4463-85d6-4f2a6325b85a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2167.885398] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b1a8dc60-af98-4f80-96cf-b2550ea8c13a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2167.885517] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a4ab788d-327a-47cc-8ae7-e1b9be889759 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2167.885636] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 17bb8415-dafd-47ed-9a14-52163ba5e7db actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2167.885753] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 82f72313-f493-4acd-a95e-765feb74a358 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2167.885919] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 797b434e-a913-43dc-a1df-39fe82da1221 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2167.886063] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6a1aa3fb-f182-4b9d-8add-7dfc70472be8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2167.886257] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2167.886394] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2167.901322] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Refreshing inventories for resource provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 2167.914261] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Updating ProviderTree inventory for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 2167.914444] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Updating inventory in ProviderTree for provider 
5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2167.924358] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Refreshing aggregate associations for resource provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f, aggregates: None {{(pid=68617) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 2167.943351] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Refreshing trait associations for resource provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_VMDK {{(pid=68617) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 2168.045383] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8a18d75-65f7-42cc-8b00-b1678f16dc28 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2168.053450] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62f62e2c-1863-4b97-be66-1cfcb26e93ec {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2168.083369] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c755e26-05b9-4805-8257-ed52f3d1a278 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2168.090296] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-769a1953-770f-45ca-8dea-4dca1592c274 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2168.103230] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2168.111842] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2168.127294] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2168.127473] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.336s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2168.162243] env[68617]: DEBUG oslo_vmware.exceptions [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Fault InvalidArgument not matched. {{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2168.162501] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2168.163071] env[68617]: ERROR nova.compute.manager [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2168.163071] env[68617]: Faults: ['InvalidArgument'] [ 2168.163071] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Traceback (most recent call last): [ 2168.163071] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2168.163071] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] yield resources [ 2168.163071] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2168.163071] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] self.driver.spawn(context, instance, image_meta, [ 2168.163071] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2168.163071] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2168.163071] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2168.163071] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] self._fetch_image_if_missing(context, vi) [ 2168.163071] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2168.163451] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] image_cache(vi, tmp_image_ds_loc) [ 2168.163451] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2168.163451] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] vm_util.copy_virtual_disk( [ 2168.163451] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2168.163451] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] session._wait_for_task(vmdk_copy_task) [ 2168.163451] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2168.163451] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] return self.wait_for_task(task_ref) [ 2168.163451] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2168.163451] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] return evt.wait() [ 2168.163451] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2168.163451] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] result = hub.switch() [ 2168.163451] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2168.163451] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] return self.greenlet.switch() [ 2168.163867] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2168.163867] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] self.f(*self.args, **self.kw) [ 2168.163867] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2168.163867] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] raise exceptions.translate_fault(task_info.error) [ 2168.163867] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2168.163867] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Faults: ['InvalidArgument'] [ 2168.163867] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] [ 2168.163867] env[68617]: INFO nova.compute.manager [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Terminating instance [ 2168.164765] env[68617]: DEBUG oslo_concurrency.lockutils [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Acquired lock "[datastore2] 
devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2168.164978] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2168.165248] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2cab330f-19eb-4ae3-a9e4-cee8b50beea2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2168.167479] env[68617]: DEBUG nova.compute.manager [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2168.167671] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2168.168382] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa55e068-1cd1-47ec-87e4-c4824c63452d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2168.174996] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2168.175222] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a709d741-5b11-4f49-a186-ffba40642225 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2168.177335] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2168.177505] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2168.178407] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-02333faa-c593-44c1-90aa-583b697232b5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2168.183088] env[68617]: DEBUG oslo_vmware.api [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Waiting for the task: (returnval){ [ 2168.183088] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52e762dd-9c11-2a41-d2d2-e243b005f321" [ 2168.183088] env[68617]: _type = "Task" [ 2168.183088] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2168.189891] env[68617]: DEBUG oslo_vmware.api [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52e762dd-9c11-2a41-d2d2-e243b005f321, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2168.238517] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2168.238778] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2168.238962] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Deleting the datastore file [datastore2] 21d0560a-fde3-4c16-b2fc-06d6f8668a7a {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2168.239229] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6c5f5b99-52e3-45b6-9ebd-885e53a1b006 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2168.245200] env[68617]: DEBUG oslo_vmware.api [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Waiting for the task: (returnval){ [ 2168.245200] env[68617]: value = "task-3470900" [ 2168.245200] env[68617]: _type = "Task" [ 2168.245200] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2168.252576] env[68617]: DEBUG oslo_vmware.api [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Task: {'id': task-3470900, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2168.693481] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2168.693481] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Creating directory with path [datastore2] vmware_temp/94f8aed1-4661-4540-be4c-80caf821e1a5/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2168.693919] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-98747e9c-dbc2-44a4-91be-8489eae70f74 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2168.706017] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Created directory with path [datastore2] vmware_temp/94f8aed1-4661-4540-be4c-80caf821e1a5/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2168.706182] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Fetch image to [datastore2] vmware_temp/94f8aed1-4661-4540-be4c-80caf821e1a5/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2168.706428] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/94f8aed1-4661-4540-be4c-80caf821e1a5/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2168.707083] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff330f03-fedb-4f31-b3bd-5c4f5005c8f0 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2168.713844] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbe4308d-57bc-40f2-9d35-8b88379a86c7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2168.723025] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4e2b73b-ab4a-4d42-b6fc-447ee500a132 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2168.756923] env[68617]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0921a4d-e768-401f-9638-d52baf649f91 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2168.763818] env[68617]: DEBUG oslo_vmware.api [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Task: {'id': task-3470900, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075023} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2168.765205] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2168.765394] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2168.765564] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2168.765737] env[68617]: INFO nova.compute.manager [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Took 0.60 seconds to destroy the instance on the hypervisor. 
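The traceback above shows the failure path for instance 21d0560a-fde3-4c16-b2fc-06d6f8668a7a: _cache_sparse_image() calls vm_util.copy_virtual_disk(), which submits a CopyVirtualDisk_Task to vCenter and then blocks in session._wait_for_task() until the poller sees a terminal task state; on error, _poll_task raises exceptions.translate_fault(task_info.error), which surfaces here as a VimFaultException carrying the InvalidArgument fault on fileType. A minimal sketch of that poll-and-translate pattern, in plain Python with illustrative names (TaskInfo, wait_for_task), not the oslo.vmware API itself:

    import time
    from dataclasses import dataclass

    @dataclass
    class TaskInfo:
        state: str               # "running" | "success" | "error"
        error: str | None = None

    class VimFaultException(Exception):
        """Stand-in for oslo_vmware.exceptions.VimFaultException."""

    def wait_for_task(poll, interval=0.5):
        # Poll a vCenter-style task until it reaches a terminal state,
        # mirroring the loopingcall-driven _poll_task in the traceback above.
        while True:
            info = poll()
            if info.state == "success":
                return info
            if info.state == "error":
                # oslo.vmware calls translate_fault() here; a plain raise suffices
                raise VimFaultException(info.error)
            time.sleep(interval)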
[ 2168.767712] env[68617]: DEBUG nova.compute.claims [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2168.767881] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2168.768103] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2168.770678] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0eacac66-dba9-4cb3-8c24-b9ffd3dc7a71 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2168.791807] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2168.881556] env[68617]: DEBUG oslo_vmware.rw_handles [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/94f8aed1-4661-4540-be4c-80caf821e1a5/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2168.941869] env[68617]: DEBUG oslo_vmware.rw_handles [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2168.942068] env[68617]: DEBUG oslo_vmware.rw_handles [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/94f8aed1-4661-4540-be4c-80caf821e1a5/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2168.984474] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18f6c9e6-6815-4217-8284-72bff82b7ddd {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2168.992158] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3cad35b-4d0a-4141-8773-5f5b3ee3fe66 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2169.020948] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc98589b-a4d2-4998-a628-2b1a7f81fdb4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2169.027591] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2694b69d-10bf-442a-9fe9-e99f8c22b6f4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2169.041060] env[68617]: DEBUG nova.compute.provider_tree [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2169.049363] env[68617]: DEBUG nova.scheduler.client.report [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2169.063627] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.295s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2169.064156] env[68617]: ERROR nova.compute.manager [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2169.064156] env[68617]: Faults: ['InvalidArgument'] [ 2169.064156] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Traceback (most recent call last): [ 2169.064156] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2169.064156] 
env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] self.driver.spawn(context, instance, image_meta, [ 2169.064156] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2169.064156] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2169.064156] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2169.064156] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] self._fetch_image_if_missing(context, vi) [ 2169.064156] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2169.064156] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] image_cache(vi, tmp_image_ds_loc) [ 2169.064156] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2169.064515] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] vm_util.copy_virtual_disk( [ 2169.064515] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2169.064515] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] session._wait_for_task(vmdk_copy_task) [ 2169.064515] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2169.064515] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] return self.wait_for_task(task_ref) [ 2169.064515] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2169.064515] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] return evt.wait() [ 2169.064515] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2169.064515] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] result = hub.switch() [ 2169.064515] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2169.064515] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] return self.greenlet.switch() [ 2169.064515] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2169.064515] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] self.f(*self.args, **self.kw) [ 2169.064906] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2169.064906] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] raise exceptions.translate_fault(task_info.error) [ 2169.064906] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2169.064906] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Faults: ['InvalidArgument'] [ 2169.064906] env[68617]: ERROR nova.compute.manager [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] [ 2169.064906] env[68617]: DEBUG nova.compute.utils [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2169.066298] env[68617]: DEBUG nova.compute.manager [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Build of instance 21d0560a-fde3-4c16-b2fc-06d6f8668a7a was re-scheduled: A specified parameter was not correct: fileType [ 2169.066298] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2169.066715] env[68617]: DEBUG nova.compute.manager [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2169.066891] env[68617]: DEBUG nova.compute.manager [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2169.067077] env[68617]: DEBUG nova.compute.manager [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2169.067241] env[68617]: DEBUG nova.network.neutron [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2169.407691] env[68617]: DEBUG nova.network.neutron [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2169.425431] env[68617]: INFO nova.compute.manager [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Took 0.36 seconds to deallocate network for instance. [ 2169.524771] env[68617]: INFO nova.scheduler.client.report [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Deleted allocations for instance 21d0560a-fde3-4c16-b2fc-06d6f8668a7a [ 2169.545044] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4e2d81a-cdeb-4b0e-b426-0098cf896f8a tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Lock "21d0560a-fde3-4c16-b2fc-06d6f8668a7a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 611.258s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2169.545313] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c5e2dd77-afa9-470f-9b6b-7c9598bbad7d tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Lock "21d0560a-fde3-4c16-b2fc-06d6f8668a7a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 415.182s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2169.545524] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c5e2dd77-afa9-470f-9b6b-7c9598bbad7d tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Acquiring lock "21d0560a-fde3-4c16-b2fc-06d6f8668a7a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2169.545726] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c5e2dd77-afa9-470f-9b6b-7c9598bbad7d tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Lock "21d0560a-fde3-4c16-b2fc-06d6f8668a7a-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2169.545892] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c5e2dd77-afa9-470f-9b6b-7c9598bbad7d tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Lock "21d0560a-fde3-4c16-b2fc-06d6f8668a7a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2169.547678] env[68617]: INFO nova.compute.manager [None req-c5e2dd77-afa9-470f-9b6b-7c9598bbad7d tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Terminating instance [ 2169.549375] env[68617]: DEBUG nova.compute.manager [None req-c5e2dd77-afa9-470f-9b6b-7c9598bbad7d tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2169.549566] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-c5e2dd77-afa9-470f-9b6b-7c9598bbad7d tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2169.550026] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5f8db587-a2a8-439f-bd2c-3b6db562f15f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2169.558921] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e10bebda-810b-4003-964d-420b2c2831c8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2169.586218] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-c5e2dd77-afa9-470f-9b6b-7c9598bbad7d tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 21d0560a-fde3-4c16-b2fc-06d6f8668a7a could not be found. [ 2169.586410] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-c5e2dd77-afa9-470f-9b6b-7c9598bbad7d tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2169.586580] env[68617]: INFO nova.compute.manager [None req-c5e2dd77-afa9-470f-9b6b-7c9598bbad7d tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2169.586811] env[68617]: DEBUG oslo.service.loopingcall [None req-c5e2dd77-afa9-470f-9b6b-7c9598bbad7d tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2169.587047] env[68617]: DEBUG nova.compute.manager [-] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2169.587144] env[68617]: DEBUG nova.network.neutron [-] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2169.611539] env[68617]: DEBUG nova.network.neutron [-] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2169.618818] env[68617]: INFO nova.compute.manager [-] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] Took 0.03 seconds to deallocate network for instance. [ 2169.704079] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c5e2dd77-afa9-470f-9b6b-7c9598bbad7d tempest-ServerActionsTestOtherA-2016970503 tempest-ServerActionsTestOtherA-2016970503-project-member] Lock "21d0560a-fde3-4c16-b2fc-06d6f8668a7a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.159s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2169.704846] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "21d0560a-fde3-4c16-b2fc-06d6f8668a7a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 266.055s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2169.705042] env[68617]: INFO nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 21d0560a-fde3-4c16-b2fc-06d6f8668a7a] During sync_power_state the instance has a pending task (deleting). Skip. 
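Throughout these entries, oslo.concurrency reports two timings per named lock: how long the caller waited to acquire it (415.182s for the terminate path above, because the build path held the instance lock for 611.258s) and how long it was then held before release. A plain-Python sketch of that waited/held bookkeeping, illustrative only and not oslo_concurrency.lockutils:

    import threading
    import time
    from contextlib import contextmanager

    _locks: dict[str, threading.Lock] = {}

    @contextmanager
    def timed_lock(name: str):
        lock = _locks.setdefault(name, threading.Lock())
        t0 = time.monotonic()
        lock.acquire()                        # "waited N s" covers this span
        acquired = time.monotonic()
        print(f'Lock "{name}" acquired :: waited {acquired - t0:.3f}s')
        try:
            yield
        finally:
            lock.release()
            print(f'Lock "{name}" "released" :: held {time.monotonic() - acquired:.3f}s')

    # e.g. with timed_lock("compute_resources"): ...abort the instance claim...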
[ 2169.705225] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "21d0560a-fde3-4c16-b2fc-06d6f8668a7a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2170.128358] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2215.153048] env[68617]: WARNING oslo_vmware.rw_handles [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2215.153048] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2215.153048] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2215.153048] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2215.153048] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2215.153048] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 2215.153048] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2215.153048] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2215.153048] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2215.153048] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2215.153048] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2215.153048] env[68617]: ERROR oslo_vmware.rw_handles [ 2215.154208] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/94f8aed1-4661-4540-be4c-80caf821e1a5/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2215.155452] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2215.155684] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Copying Virtual Disk [datastore2] vmware_temp/94f8aed1-4661-4540-be4c-80caf821e1a5/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] 
vmware_temp/94f8aed1-4661-4540-be4c-80caf821e1a5/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2215.155960] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6dd7f56f-6d8b-436d-ae33-5d7bf4bfda00 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2215.164793] env[68617]: DEBUG oslo_vmware.api [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Waiting for the task: (returnval){ [ 2215.164793] env[68617]: value = "task-3470901" [ 2215.164793] env[68617]: _type = "Task" [ 2215.164793] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2215.172653] env[68617]: DEBUG oslo_vmware.api [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Task: {'id': task-3470901, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2215.675747] env[68617]: DEBUG oslo_vmware.exceptions [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Fault InvalidArgument not matched. {{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2215.676087] env[68617]: DEBUG oslo_concurrency.lockutils [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2215.676652] env[68617]: ERROR nova.compute.manager [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2215.676652] env[68617]: Faults: ['InvalidArgument'] [ 2215.676652] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Traceback (most recent call last): [ 2215.676652] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2215.676652] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] yield resources [ 2215.676652] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2215.676652] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] self.driver.spawn(context, instance, image_meta, [ 2215.676652] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 
539, in spawn [ 2215.676652] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2215.676652] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2215.676652] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] self._fetch_image_if_missing(context, vi) [ 2215.676652] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2215.677021] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] image_cache(vi, tmp_image_ds_loc) [ 2215.677021] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2215.677021] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] vm_util.copy_virtual_disk( [ 2215.677021] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2215.677021] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] session._wait_for_task(vmdk_copy_task) [ 2215.677021] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2215.677021] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] return self.wait_for_task(task_ref) [ 2215.677021] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2215.677021] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] return evt.wait() [ 2215.677021] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2215.677021] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] result = hub.switch() [ 2215.677021] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2215.677021] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] return self.greenlet.switch() [ 2215.677393] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2215.677393] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] self.f(*self.args, **self.kw) [ 2215.677393] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2215.677393] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] raise exceptions.translate_fault(task_info.error) [ 2215.677393] env[68617]: ERROR nova.compute.manager 
[instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2215.677393] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Faults: ['InvalidArgument'] [ 2215.677393] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] [ 2215.677393] env[68617]: INFO nova.compute.manager [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Terminating instance [ 2215.678604] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2215.678815] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2215.680111] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e7d40a4d-efb4-4c58-a8ab-748c6ef51548 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2215.682512] env[68617]: DEBUG nova.compute.manager [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Start destroying the instance on the hypervisor. 
{{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2215.682700] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2215.683425] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9b76950-5fc4-4fe7-8653-4f304c18a303 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2215.689925] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2215.690183] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9b0a0c89-7014-4bb9-8378-5c31c2d8d56f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2215.692320] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2215.692485] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2215.693411] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7287ce1d-8c1d-4e66-888a-d3d546c6cee2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2215.698060] env[68617]: DEBUG oslo_vmware.api [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Waiting for the task: (returnval){ [ 2215.698060] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]520c8990-7282-4a86-bf57-2709f2075cea" [ 2215.698060] env[68617]: _type = "Task" [ 2215.698060] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2215.709144] env[68617]: DEBUG oslo_vmware.api [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]520c8990-7282-4a86-bf57-2709f2075cea, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2215.759619] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2215.759835] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2215.760025] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Deleting the datastore file [datastore2] 902b5ab9-23b8-450f-853a-b2da889c3afd {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2215.760293] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b521d9e5-6fa5-4e78-b029-fd85e2caab45 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2215.766557] env[68617]: DEBUG oslo_vmware.api [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Waiting for the task: (returnval){ [ 2215.766557] env[68617]: value = "task-3470903" [ 2215.766557] env[68617]: _type = "Task" [ 2215.766557] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2215.774189] env[68617]: DEBUG oslo_vmware.api [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Task: {'id': task-3470903, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2216.208058] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2216.208402] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Creating directory with path [datastore2] vmware_temp/ae92b564-45cf-4ba4-ab95-50b81d7ee2c8/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2216.208506] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0e684119-2254-4e58-8101-6cff69033b4d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.219373] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Created directory with path [datastore2] vmware_temp/ae92b564-45cf-4ba4-ab95-50b81d7ee2c8/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2216.219558] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Fetch image to [datastore2] vmware_temp/ae92b564-45cf-4ba4-ab95-50b81d7ee2c8/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2216.219726] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/ae92b564-45cf-4ba4-ab95-50b81d7ee2c8/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2216.220421] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fa00eaa-6891-4ad2-bef6-8f75dc2d82fa {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.227247] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5712e64d-ad9a-4e64-82d6-29fd80ec749a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.235921] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c854cc90-c767-4a75-ae95-db30ce10cf1c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.270365] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-c4177552-6adc-41ea-b96e-459dec73cb08 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.276951] env[68617]: DEBUG oslo_vmware.api [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Task: {'id': task-3470903, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.080467} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2216.278405] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2216.278592] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2216.278761] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2216.278932] env[68617]: INFO nova.compute.manager [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Took 0.60 seconds to destroy the instance on the hypervisor. 
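Two details of this second failure cycle are worth noting. The WARNING at 2215.153 comes from closing the write handle: after the full image has been streamed, close() calls getresponse(), and the ESX host has already dropped the connection, so http.client raises RemoteDisconnected even though the subsequent "Downloaded image file data" entry confirms the transfer completed. And "Fault InvalidArgument not matched" at 2215.675 means oslo.vmware found no registered exception class for that fault name and fell back to the generic VimFaultException. A hedged sketch of a close() that treats the dropped connection as benign, illustrative rather than the rw_handles implementation:

    import http.client

    def close_write_handle(conn: http.client.HTTPSConnection) -> None:
        try:
            conn.getresponse()   # may raise if the remote end already hung up
        except http.client.RemoteDisconnected:
            # The upload itself finished; log and swallow the dropped connection.
            print("WARNING: remote end closed connection without response")
        finally:
            conn.close()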
[ 2216.280929] env[68617]: DEBUG nova.compute.claims [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2216.281122] env[68617]: DEBUG oslo_concurrency.lockutils [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2216.281364] env[68617]: DEBUG oslo_concurrency.lockutils [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2216.284528] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-9b3dd04d-661c-46be-958e-f22fa64aa250 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.305497] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2216.381476] env[68617]: DEBUG oslo_vmware.rw_handles [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ae92b564-45cf-4ba4-ab95-50b81d7ee2c8/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2216.441419] env[68617]: DEBUG oslo_vmware.rw_handles [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2216.441670] env[68617]: DEBUG oslo_vmware.rw_handles [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ae92b564-45cf-4ba4-ab95-50b81d7ee2c8/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2216.486990] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b04cec77-a23b-4cc4-96d0-660641aa0fd1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.494991] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-232a7eaf-fd04-4739-82a3-29386f3d8f97 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.524829] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f32cf8e5-000f-48b1-9f7c-c9c435a03808 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.532014] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91ab3d11-7bdc-460f-baf1-cf636023459d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.544838] env[68617]: DEBUG nova.compute.provider_tree [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2216.553329] env[68617]: DEBUG nova.scheduler.client.report [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2216.568252] env[68617]: DEBUG oslo_concurrency.lockutils [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.287s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2216.568816] env[68617]: ERROR nova.compute.manager [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2216.568816] env[68617]: Faults: ['InvalidArgument'] [ 2216.568816] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Traceback (most recent call last): [ 2216.568816] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 2216.568816] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] self.driver.spawn(context, instance, image_meta, [ 2216.568816] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2216.568816] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2216.568816] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2216.568816] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] self._fetch_image_if_missing(context, vi) [ 2216.568816] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2216.568816] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] image_cache(vi, tmp_image_ds_loc) [ 2216.568816] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2216.569175] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] vm_util.copy_virtual_disk( [ 2216.569175] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2216.569175] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] session._wait_for_task(vmdk_copy_task) [ 2216.569175] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2216.569175] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] return self.wait_for_task(task_ref) [ 2216.569175] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2216.569175] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] return evt.wait() [ 2216.569175] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2216.569175] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] result = hub.switch() [ 2216.569175] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2216.569175] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] return self.greenlet.switch() [ 2216.569175] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2216.569175] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] self.f(*self.args, **self.kw) [ 2216.569540] env[68617]: ERROR nova.compute.manager [instance: 
902b5ab9-23b8-450f-853a-b2da889c3afd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2216.569540] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] raise exceptions.translate_fault(task_info.error) [ 2216.569540] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2216.569540] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Faults: ['InvalidArgument'] [ 2216.569540] env[68617]: ERROR nova.compute.manager [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] [ 2216.569540] env[68617]: DEBUG nova.compute.utils [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2216.570863] env[68617]: DEBUG nova.compute.manager [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Build of instance 902b5ab9-23b8-450f-853a-b2da889c3afd was re-scheduled: A specified parameter was not correct: fileType [ 2216.570863] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2216.571349] env[68617]: DEBUG nova.compute.manager [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2216.571518] env[68617]: DEBUG nova.compute.manager [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2216.571706] env[68617]: DEBUG nova.compute.manager [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2216.571898] env[68617]: DEBUG nova.network.neutron [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2216.870911] env[68617]: DEBUG nova.network.neutron [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2216.882471] env[68617]: INFO nova.compute.manager [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Took 0.31 seconds to deallocate network for instance. [ 2216.966547] env[68617]: INFO nova.scheduler.client.report [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Deleted allocations for instance 902b5ab9-23b8-450f-853a-b2da889c3afd [ 2216.986407] env[68617]: DEBUG oslo_concurrency.lockutils [None req-858d0d76-2ced-4eeb-965f-f9d7993cd74a tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Lock "902b5ab9-23b8-450f-853a-b2da889c3afd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 654.747s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2216.986670] env[68617]: DEBUG oslo_concurrency.lockutils [None req-054eb420-90e6-46b5-993a-5914b3863296 tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Lock "902b5ab9-23b8-450f-853a-b2da889c3afd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 458.375s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2216.986884] env[68617]: DEBUG oslo_concurrency.lockutils [None req-054eb420-90e6-46b5-993a-5914b3863296 tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Acquiring lock "902b5ab9-23b8-450f-853a-b2da889c3afd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2216.987276] env[68617]: DEBUG oslo_concurrency.lockutils [None req-054eb420-90e6-46b5-993a-5914b3863296 tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Lock
"902b5ab9-23b8-450f-853a-b2da889c3afd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2216.987276] env[68617]: DEBUG oslo_concurrency.lockutils [None req-054eb420-90e6-46b5-993a-5914b3863296 tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Lock "902b5ab9-23b8-450f-853a-b2da889c3afd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2216.989141] env[68617]: INFO nova.compute.manager [None req-054eb420-90e6-46b5-993a-5914b3863296 tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Terminating instance [ 2216.990792] env[68617]: DEBUG nova.compute.manager [None req-054eb420-90e6-46b5-993a-5914b3863296 tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2216.990988] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-054eb420-90e6-46b5-993a-5914b3863296 tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2216.991515] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5a1a5e2b-e7c4-45ce-9664-c233e264ed52 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2217.000383] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3f05110-9ba8-441d-8d1f-4ca77f929705 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2217.027283] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-054eb420-90e6-46b5-993a-5914b3863296 tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 902b5ab9-23b8-450f-853a-b2da889c3afd could not be found. [ 2217.027474] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-054eb420-90e6-46b5-993a-5914b3863296 tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2217.027651] env[68617]: INFO nova.compute.manager [None req-054eb420-90e6-46b5-993a-5914b3863296 tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2217.027887] env[68617]: DEBUG oslo.service.loopingcall [None req-054eb420-90e6-46b5-993a-5914b3863296 tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2217.028120] env[68617]: DEBUG nova.compute.manager [-] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2217.028217] env[68617]: DEBUG nova.network.neutron [-] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2217.050459] env[68617]: DEBUG nova.network.neutron [-] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2217.058598] env[68617]: INFO nova.compute.manager [-] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] Took 0.03 seconds to deallocate network for instance. [ 2217.161527] env[68617]: DEBUG oslo_concurrency.lockutils [None req-054eb420-90e6-46b5-993a-5914b3863296 tempest-ServerRescueTestJSONUnderV235-1923429183 tempest-ServerRescueTestJSONUnderV235-1923429183-project-member] Lock "902b5ab9-23b8-450f-853a-b2da889c3afd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.174s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2217.161820] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "902b5ab9-23b8-450f-853a-b2da889c3afd" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 313.512s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2217.162014] env[68617]: INFO nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 902b5ab9-23b8-450f-853a-b2da889c3afd] During sync_power_state the instance has a pending task (deleting). Skip.
[ 2217.162211] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "902b5ab9-23b8-450f-853a-b2da889c3afd" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2219.694243] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2221.699405] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2221.699740] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2221.699740] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2221.716721] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2221.716881] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2221.717016] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2221.717176] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2221.717305] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2221.717449] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Skipping network cache update for instance because it is Building.
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2221.717574] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2221.717696] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2224.699103] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2224.699608] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2224.699608] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2224.699772] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2224.699926] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2226.700303] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2229.694667] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2229.698292] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2229.698486] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2229.712203] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2229.712387] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2229.712553] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2229.712740] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2229.713861] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1b6f58e-e788-4d2c-ab31-11eab80c260d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2229.723455] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a45d6825-4ce6-4641-9b66-9550a012176b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2229.736888] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90ecdea2-9a60-49ad-8dd4-25e87c22b6a2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2229.742870] env[68617]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2962860c-c37f-4be3-833e-0e14bc96cd69 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2229.770564] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180897MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2229.770699] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2229.770881] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2229.830046] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 922c8926-c636-4463-85d6-4f2a6325b85a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2229.830210] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b1a8dc60-af98-4f80-96cf-b2550ea8c13a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2229.830369] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a4ab788d-327a-47cc-8ae7-e1b9be889759 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2229.830496] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 17bb8415-dafd-47ed-9a14-52163ba5e7db actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2229.830620] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 82f72313-f493-4acd-a95e-765feb74a358 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2229.830738] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 797b434e-a913-43dc-a1df-39fe82da1221 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2229.830877] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6a1aa3fb-f182-4b9d-8add-7dfc70472be8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2229.831069] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2229.831206] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2229.910960] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42a897f6-04e5-4b6c-bab9-ab8ece86b7dc {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2229.918725] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-372bf65c-78fa-40c9-8fde-a46f69cd4bae {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2229.949204] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32c73a98-ec85-44e6-80a6-22bfc0cff0a2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2229.957689] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e19b447a-2b57-434b-8a5d-f34bb1d98854 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2229.970411] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2229.978346] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2229.991777] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2229.991962] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.221s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2265.168789] env[68617]: WARNING oslo_vmware.rw_handles [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2265.168789] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2265.168789] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2265.168789] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2265.168789] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2265.168789] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 2265.168789] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2265.168789] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2265.168789] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2265.168789] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2265.168789] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2265.168789] env[68617]: ERROR oslo_vmware.rw_handles [ 2265.169565] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/ae92b564-45cf-4ba4-ab95-50b81d7ee2c8/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2265.171249] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2265.171487] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Copying Virtual Disk [datastore2] vmware_temp/ae92b564-45cf-4ba4-ab95-50b81d7ee2c8/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] 
vmware_temp/ae92b564-45cf-4ba4-ab95-50b81d7ee2c8/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2265.171769] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-41c18518-cf2f-41de-a27f-59984500ca9d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.179560] env[68617]: DEBUG oslo_vmware.api [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Waiting for the task: (returnval){ [ 2265.179560] env[68617]: value = "task-3470904" [ 2265.179560] env[68617]: _type = "Task" [ 2265.179560] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2265.187671] env[68617]: DEBUG oslo_vmware.api [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Task: {'id': task-3470904, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2265.689932] env[68617]: DEBUG oslo_vmware.exceptions [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Fault InvalidArgument not matched. {{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2265.690248] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2265.690795] env[68617]: ERROR nova.compute.manager [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2265.690795] env[68617]: Faults: ['InvalidArgument'] [ 2265.690795] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Traceback (most recent call last): [ 2265.690795] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2265.690795] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] yield resources [ 2265.690795] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2265.690795] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] self.driver.spawn(context, instance, image_meta, [ 2265.690795] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2265.690795] 
env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2265.690795] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2265.690795] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] self._fetch_image_if_missing(context, vi) [ 2265.690795] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2265.691161] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] image_cache(vi, tmp_image_ds_loc) [ 2265.691161] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2265.691161] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] vm_util.copy_virtual_disk( [ 2265.691161] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2265.691161] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] session._wait_for_task(vmdk_copy_task) [ 2265.691161] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2265.691161] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] return self.wait_for_task(task_ref) [ 2265.691161] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2265.691161] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] return evt.wait() [ 2265.691161] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2265.691161] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] result = hub.switch() [ 2265.691161] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2265.691161] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] return self.greenlet.switch() [ 2265.691503] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2265.691503] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] self.f(*self.args, **self.kw) [ 2265.691503] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2265.691503] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] raise exceptions.translate_fault(task_info.error) [ 2265.691503] env[68617]: ERROR nova.compute.manager [instance: 
922c8926-c636-4463-85d6-4f2a6325b85a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2265.691503] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Faults: ['InvalidArgument'] [ 2265.691503] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] [ 2265.691503] env[68617]: INFO nova.compute.manager [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Terminating instance [ 2265.692642] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2265.692841] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2265.693107] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-10d80735-78c6-4af5-94ad-1f5601cfc3cb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.695373] env[68617]: DEBUG nova.compute.manager [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Start destroying the instance on the hypervisor. 
{{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2265.695557] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2265.696256] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8528e0bb-8fb5-4956-8eb2-f10e10705532 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.702454] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2265.702805] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-777d68fb-849a-40cf-b8b9-e56fc0ab3823 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.704687] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2265.704849] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2265.705784] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e470ce3e-4985-496c-8e88-eb9faf0fd4cb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.710285] env[68617]: DEBUG oslo_vmware.api [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Waiting for the task: (returnval){ [ 2265.710285] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52fd00c8-ab87-d5f5-a418-54719923095f" [ 2265.710285] env[68617]: _type = "Task" [ 2265.710285] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2265.724669] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2265.724871] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Creating directory with path [datastore2] vmware_temp/4910fa75-9c14-47a1-ab37-b22fd375255c/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2265.725079] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7c11dc59-b181-4e0d-b04e-73ccc6fd57fa {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.744611] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Created directory with path [datastore2] vmware_temp/4910fa75-9c14-47a1-ab37-b22fd375255c/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2265.744787] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Fetch image to [datastore2] vmware_temp/4910fa75-9c14-47a1-ab37-b22fd375255c/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2265.744954] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/4910fa75-9c14-47a1-ab37-b22fd375255c/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2265.745676] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58ade19d-ae26-476c-aca2-be9e04025015 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.751992] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd7f9de0-47fb-4179-8154-e4e84243d619 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.761811] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63903575-0cdf-4e0e-91fd-997e72e6ed8b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.794163] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-2573f6d1-bb40-4ec0-bbac-63aecd01109c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.796752] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2265.796946] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2265.797137] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Deleting the datastore file [datastore2] 922c8926-c636-4463-85d6-4f2a6325b85a {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2265.797359] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-fc957ce5-2b1d-4b12-946f-416b1a3ecb4c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.803379] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-46258a40-ea73-4146-93d1-b980c28eeb56 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.804983] env[68617]: DEBUG oslo_vmware.api [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Waiting for the task: (returnval){ [ 2265.804983] env[68617]: value = "task-3470906" [ 2265.804983] env[68617]: _type = "Task" [ 2265.804983] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2265.811941] env[68617]: DEBUG oslo_vmware.api [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Task: {'id': task-3470906, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2265.824608] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2265.878250] env[68617]: DEBUG oslo_vmware.rw_handles [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4910fa75-9c14-47a1-ab37-b22fd375255c/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2265.937966] env[68617]: DEBUG oslo_vmware.rw_handles [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2265.938190] env[68617]: DEBUG oslo_vmware.rw_handles [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4910fa75-9c14-47a1-ab37-b22fd375255c/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2266.315418] env[68617]: DEBUG oslo_vmware.api [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Task: {'id': task-3470906, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067241} completed successfully. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2266.315736] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2266.315771] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2266.315938] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2266.316137] env[68617]: INFO nova.compute.manager [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Took 0.62 seconds to destroy the instance on the hypervisor. [ 2266.318179] env[68617]: DEBUG nova.compute.claims [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2266.318351] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2266.318563] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2266.442631] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e5c21be-f941-4cfb-ae37-c29b50fef2f4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2266.449660] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fa5abee-ab1f-47f7-98fd-0b83d3d72225 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2266.479852] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3febcca-0529-47d2-86b6-8a338a97ede8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2266.486287] env[68617]: 
DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8e8fc7c-dd90-4436-a975-ec7b84a2a368 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2266.498888] env[68617]: DEBUG nova.compute.provider_tree [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2266.509935] env[68617]: DEBUG nova.scheduler.client.report [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2266.522516] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.204s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2266.523025] env[68617]: ERROR nova.compute.manager [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2266.523025] env[68617]: Faults: ['InvalidArgument'] [ 2266.523025] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Traceback (most recent call last): [ 2266.523025] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2266.523025] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] self.driver.spawn(context, instance, image_meta, [ 2266.523025] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2266.523025] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2266.523025] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2266.523025] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] self._fetch_image_if_missing(context, vi) [ 2266.523025] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2266.523025] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] image_cache(vi, tmp_image_ds_loc) [ 2266.523025] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2266.523414] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] vm_util.copy_virtual_disk( [ 2266.523414] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2266.523414] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] session._wait_for_task(vmdk_copy_task) [ 2266.523414] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2266.523414] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] return self.wait_for_task(task_ref) [ 2266.523414] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2266.523414] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] return evt.wait() [ 2266.523414] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2266.523414] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] result = hub.switch() [ 2266.523414] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2266.523414] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] return self.greenlet.switch() [ 2266.523414] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2266.523414] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] self.f(*self.args, **self.kw) [ 2266.523765] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2266.523765] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] raise exceptions.translate_fault(task_info.error) [ 2266.523765] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2266.523765] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Faults: ['InvalidArgument'] [ 2266.523765] env[68617]: ERROR nova.compute.manager [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] [ 2266.523765] env[68617]: DEBUG nova.compute.utils [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 
922c8926-c636-4463-85d6-4f2a6325b85a] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2266.525013] env[68617]: DEBUG nova.compute.manager [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Build of instance 922c8926-c636-4463-85d6-4f2a6325b85a was re-scheduled: A specified parameter was not correct: fileType [ 2266.525013] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2266.525396] env[68617]: DEBUG nova.compute.manager [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2266.525565] env[68617]: DEBUG nova.compute.manager [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2266.525732] env[68617]: DEBUG nova.compute.manager [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2266.525889] env[68617]: DEBUG nova.network.neutron [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2266.830523] env[68617]: DEBUG nova.network.neutron [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2266.846711] env[68617]: INFO nova.compute.manager [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Took 0.32 seconds to deallocate network for instance. 
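The traceback above traces the build failure from ComputeManager._build_and_run_instance through the VMware driver's spawn -> _fetch_image_if_missing -> copy_virtual_disk path into oslo.vmware's task-polling loop, where the vCenter task's error state is translated into a VimFaultException ("A specified parameter was not correct: fileType"). A minimal sketch of that polling pattern follows; get_task_info is an assumed callable standing in for the real oslo.vmware session API, and the real loop lives in oslo_vmware/api.py (_poll_task, raising via exceptions.translate_fault).

import time

class VimFaultException(Exception):
    """Simplified stand-in for oslo_vmware.exceptions.VimFaultException."""
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

def wait_for_task(get_task_info, task_ref, poll_interval=0.5):
    """Poll a vCenter task until it completes or fails.

    get_task_info is an assumed helper returning an object with
    .state ('running' | 'success' | 'error'), .result and .error
    attributes; this is a sketch of the pattern, not oslo.vmware's
    actual implementation.
    """
    while True:
        info = get_task_info(task_ref)
        if info.state == 'success':
            return info.result
        if info.state == 'error':
            # This is where faults such as InvalidArgument ('fileType')
            # surface to the caller, as in the traceback above.
            raise VimFaultException(info.error.faults, info.error.message)
        time.sleep(poll_interval)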
[ 2266.944376] env[68617]: INFO nova.scheduler.client.report [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Deleted allocations for instance 922c8926-c636-4463-85d6-4f2a6325b85a [ 2266.967325] env[68617]: DEBUG oslo_concurrency.lockutils [None req-e1d3bb92-31fd-49e5-b9ff-71b51f2c596a tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Lock "922c8926-c636-4463-85d6-4f2a6325b85a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 677.358s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2266.967608] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d257f5f5-2eb5-4c3e-b15c-007d64111732 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Lock "922c8926-c636-4463-85d6-4f2a6325b85a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 481.569s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2266.967835] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d257f5f5-2eb5-4c3e-b15c-007d64111732 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Acquiring lock "922c8926-c636-4463-85d6-4f2a6325b85a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2266.968068] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d257f5f5-2eb5-4c3e-b15c-007d64111732 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Lock "922c8926-c636-4463-85d6-4f2a6325b85a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2266.968248] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d257f5f5-2eb5-4c3e-b15c-007d64111732 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Lock "922c8926-c636-4463-85d6-4f2a6325b85a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2266.970673] env[68617]: INFO nova.compute.manager [None req-d257f5f5-2eb5-4c3e-b15c-007d64111732 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Terminating instance [ 2266.971780] env[68617]: DEBUG nova.compute.manager [None req-d257f5f5-2eb5-4c3e-b15c-007d64111732 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Start destroying the instance on the hypervisor. 
{{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2266.972039] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d257f5f5-2eb5-4c3e-b15c-007d64111732 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2266.972523] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-282c2d77-4e2e-421f-815c-f3f9e55a2ce8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2266.981388] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb16d3a9-406c-4506-afca-868a27a09de3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2267.007623] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-d257f5f5-2eb5-4c3e-b15c-007d64111732 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 922c8926-c636-4463-85d6-4f2a6325b85a could not be found. [ 2267.008668] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d257f5f5-2eb5-4c3e-b15c-007d64111732 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2267.008668] env[68617]: INFO nova.compute.manager [None req-d257f5f5-2eb5-4c3e-b15c-007d64111732 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2267.008668] env[68617]: DEBUG oslo.service.loopingcall [None req-d257f5f5-2eb5-4c3e-b15c-007d64111732 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2267.008855] env[68617]: DEBUG nova.compute.manager [-] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2267.008855] env[68617]: DEBUG nova.network.neutron [-] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2267.032625] env[68617]: DEBUG nova.network.neutron [-] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2267.040596] env[68617]: INFO nova.compute.manager [-] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] Took 0.03 seconds to deallocate network for instance. 
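The terminate path above shows Nova's per-instance serialization: do_terminate_instance waits on the instance-UUID lock (held for 677.358s by the failed build attempt), then briefly takes the "<uuid>-events" lock to clear pending external events before destroying the instance on the hypervisor. A sketch of that locking pattern using oslo.concurrency's lock context manager, with the handler bodies as placeholders:

from oslo_concurrency import lockutils

INSTANCE_UUID = '922c8926-c636-4463-85d6-4f2a6325b85a'

def clear_events_for_instance(instance_uuid):
    pass  # placeholder for nova.compute.manager.InstanceEvents logic

def shutdown_instance(instance_uuid):
    pass  # placeholder for ComputeManager._shutdown_instance

def do_terminate_instance(instance_uuid):
    # Outer lock serializes all lifecycle operations for one instance;
    # terminate blocks here until the failed build releases it.
    with lockutils.lock(instance_uuid):
        # Nested, short-lived lock guarding the instance's event queue
        # (the "<uuid>-events" lock in the log above).
        with lockutils.lock(instance_uuid + '-events'):
            clear_events_for_instance(instance_uuid)
        shutdown_instance(instance_uuid)

do_terminate_instance(INSTANCE_UUID)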
[ 2267.130117] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d257f5f5-2eb5-4c3e-b15c-007d64111732 tempest-AttachVolumeShelveTestJSON-1744895665 tempest-AttachVolumeShelveTestJSON-1744895665-project-member] Lock "922c8926-c636-4463-85d6-4f2a6325b85a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.162s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2267.131286] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "922c8926-c636-4463-85d6-4f2a6325b85a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 363.480s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2267.131286] env[68617]: INFO nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 922c8926-c636-4463-85d6-4f2a6325b85a] During sync_power_state the instance has a pending task (deleting). Skip. [ 2267.131286] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "922c8926-c636-4463-85d6-4f2a6325b85a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2282.993773] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2282.994209] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2282.994209] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2283.010993] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2283.011157] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2283.011288] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2283.011413] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Skipping network cache update for instance because it is Building. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2283.011534] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2283.011652] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2283.011772] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2284.660557] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d87b6b82-9ddf-4183-bf0b-1e00c4ff752d tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "6a1aa3fb-f182-4b9d-8add-7dfc70472be8" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2284.698457] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2284.698665] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2285.699364] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2285.699700] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2285.699759] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2288.701275] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2289.698987] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2289.710555] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2289.710872] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2289.710906] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2289.711082] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2289.712194] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-035dc1bd-feef-477f-b7e1-02acd2a503c5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.721067] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d127dd2-a250-4d43-8b22-ec4eab60d6a5 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.734743] env[68617]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-526eb5a8-41bd-4ef1-8dde-90df6de08b41 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.740653] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e758252-8ed1-4602-a8cd-d7a2becb9752 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.770064] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180892MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2289.770203] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2289.770409] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2289.829550] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance b1a8dc60-af98-4f80-96cf-b2550ea8c13a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2289.829712] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a4ab788d-327a-47cc-8ae7-e1b9be889759 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2289.829841] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 17bb8415-dafd-47ed-9a14-52163ba5e7db actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2289.829964] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 82f72313-f493-4acd-a95e-765feb74a358 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2289.830098] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 797b434e-a913-43dc-a1df-39fe82da1221 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2289.830217] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6a1aa3fb-f182-4b9d-8add-7dfc70472be8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2289.830416] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2289.830567] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2289.910087] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-620649f4-b53c-42ca-bd36-decf272f120b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.917659] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6b0fa0a-eca7-4a87-9a41-0a949c463ee0 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.946765] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fadcd550-db8a-42e4-9a39-4731d3f339dc {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.953717] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07d11887-86dd-47a6-95e8-15ba0f96c9de {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.966542] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2289.974510] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2289.988634] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2289.988822] 
env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.218s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2290.984625] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2291.699485] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2314.302687] env[68617]: WARNING oslo_vmware.rw_handles [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2314.302687] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2314.302687] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2314.302687] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2314.302687] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2314.302687] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 2314.302687] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2314.302687] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2314.302687] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2314.302687] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2314.302687] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2314.302687] env[68617]: ERROR oslo_vmware.rw_handles [ 2314.303514] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/4910fa75-9c14-47a1-ab37-b22fd375255c/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2314.305265] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2314.305507] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 
tempest-ServerDiskConfigTestJSON-797669125-project-member] Copying Virtual Disk [datastore2] vmware_temp/4910fa75-9c14-47a1-ab37-b22fd375255c/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/4910fa75-9c14-47a1-ab37-b22fd375255c/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2314.305792] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ddadd87d-c6aa-4410-9f69-364bc0024b61 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2314.316167] env[68617]: DEBUG oslo_vmware.api [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Waiting for the task: (returnval){ [ 2314.316167] env[68617]: value = "task-3470907" [ 2314.316167] env[68617]: _type = "Task" [ 2314.316167] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2314.323600] env[68617]: DEBUG oslo_vmware.api [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Task: {'id': task-3470907, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2314.826733] env[68617]: DEBUG oslo_vmware.exceptions [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Fault InvalidArgument not matched. 
{{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2314.827012] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2314.827596] env[68617]: ERROR nova.compute.manager [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2314.827596] env[68617]: Faults: ['InvalidArgument'] [ 2314.827596] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Traceback (most recent call last): [ 2314.827596] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2314.827596] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] yield resources [ 2314.827596] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2314.827596] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] self.driver.spawn(context, instance, image_meta, [ 2314.827596] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2314.827596] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2314.827596] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2314.827596] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] self._fetch_image_if_missing(context, vi) [ 2314.827596] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2314.827915] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] image_cache(vi, tmp_image_ds_loc) [ 2314.827915] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2314.827915] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] vm_util.copy_virtual_disk( [ 2314.827915] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2314.827915] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] session._wait_for_task(vmdk_copy_task) [ 2314.827915] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2314.827915] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] return self.wait_for_task(task_ref) [ 2314.827915] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2314.827915] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] return evt.wait() [ 2314.827915] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2314.827915] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] result = hub.switch() [ 2314.827915] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2314.827915] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] return self.greenlet.switch() [ 2314.828483] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2314.828483] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] self.f(*self.args, **self.kw) [ 2314.828483] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2314.828483] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] raise exceptions.translate_fault(task_info.error) [ 2314.828483] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2314.828483] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Faults: ['InvalidArgument'] [ 2314.828483] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] [ 2314.828483] env[68617]: INFO nova.compute.manager [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Terminating instance [ 2314.830073] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2314.830073] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2314.830073] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1cb5f6c8-290f-41e0-9aac-346d0b62dc80 
{{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2314.832112] env[68617]: DEBUG nova.compute.manager [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2314.832306] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2314.833062] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8df1ff78-d5d4-4c03-95ae-5ec79bbc8c13 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2314.840562] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2314.841592] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-68b83745-a220-4b5e-9660-af004a3bd143 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2314.842992] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2314.843174] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2314.843855] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-00a807d0-7b2f-4677-b164-5164b3a22cf9 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2314.848627] env[68617]: DEBUG oslo_vmware.api [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Waiting for the task: (returnval){ [ 2314.848627] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52b129ee-edb7-3bb4-1fc3-61cb84595ad3" [ 2314.848627] env[68617]: _type = "Task" [ 2314.848627] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2314.855662] env[68617]: DEBUG oslo_vmware.api [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52b129ee-edb7-3bb4-1fc3-61cb84595ad3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2314.908530] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2314.908736] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2314.908912] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Deleting the datastore file [datastore2] b1a8dc60-af98-4f80-96cf-b2550ea8c13a {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2314.909198] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-aea45169-ddda-4f15-8573-7538426a9a35 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2314.915102] env[68617]: DEBUG oslo_vmware.api [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Waiting for the task: (returnval){ [ 2314.915102] env[68617]: value = "task-3470909" [ 2314.915102] env[68617]: _type = "Task" [ 2314.915102] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2314.922352] env[68617]: DEBUG oslo_vmware.api [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Task: {'id': task-3470909, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2315.359668] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2315.359989] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Creating directory with path [datastore2] vmware_temp/8f09f0ea-71f7-4445-af40-75d9ca69916d/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2315.360187] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d3c2a0fc-f706-403e-9e01-dd5bf7799882 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.371636] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Created directory with path [datastore2] vmware_temp/8f09f0ea-71f7-4445-af40-75d9ca69916d/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2315.371829] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Fetch image to [datastore2] vmware_temp/8f09f0ea-71f7-4445-af40-75d9ca69916d/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2315.371980] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/8f09f0ea-71f7-4445-af40-75d9ca69916d/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2315.372713] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce90d113-4f68-4aab-a689-26f91fe73ec1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.379152] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9638bf37-5f5c-4094-86cb-920ff604cda9 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.388075] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2752c603-55f3-4c03-8bdb-0861d5a076ae {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.423503] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-4b8a4d20-f7ba-48aa-a889-0c1d2e1b163e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.430809] env[68617]: DEBUG oslo_vmware.api [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Task: {'id': task-3470909, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.084466} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2315.432308] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2315.432527] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2315.432742] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2315.432931] env[68617]: INFO nova.compute.manager [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Took 0.60 seconds to destroy the instance on the hypervisor. 
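
The DeleteDatastoreFile_Task / wait_for_task exchange above follows oslo.vmware's standard task-polling pattern: invoke_api() submits the call and returns a task moref, and wait_for_task() blocks the greenthread, polling task state (the "Waiting for the task ... progress is 0% ... completed successfully" records) until the task succeeds or raises a translated fault. A minimal standalone sketch of that pattern; every connection value below is a placeholder, not taken from this log:

    from oslo_vmware import api

    # Placeholder host/credentials; positional args are host, server_username,
    # server_password, api_retry_count, task_poll_interval (seconds per poll).
    session = api.VMwareAPISession('vc.example.invalid', 'svc-user', 'secret',
                                   10, 0.5)

    # FileManager.DeleteDatastoreFile_Task returns a task moref; wait_for_task()
    # polls it and raises a translated VimFaultException if it ends in error.
    file_manager = session.vim.service_content.fileManager
    task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                              file_manager,
                              name='[datastore2] some-dir-to-delete',  # hypothetical path
                              datacenter=None)
    session.wait_for_task(task)
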
[ 2315.434752] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5b7ce2d1-2500-443e-8994-56503f5e6a53 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.436752] env[68617]: DEBUG nova.compute.claims [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2315.436964] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2315.437169] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2315.462359] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2315.564092] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21ab6438-1245-496b-9b12-dd3f85fb0790 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.574798] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68391b45-7e3c-476b-ab61-70c57dacc52e {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.606082] env[68617]: DEBUG oslo_vmware.rw_handles [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8f09f0ea-71f7-4445-af40-75d9ca69916d/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2315.607930] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fee45ce2-58ab-4e22-8c55-1e5afeb0a444 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.669681] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93f8bc7f-48e7-4839-93a2-544316f87d34 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.673924] env[68617]: DEBUG oslo_vmware.rw_handles [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2315.674112] env[68617]: DEBUG oslo_vmware.rw_handles [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8f09f0ea-71f7-4445-af40-75d9ca69916d/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2315.684297] env[68617]: DEBUG nova.compute.provider_tree [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2315.694165] env[68617]: DEBUG nova.scheduler.client.report [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2315.707290] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.270s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2315.707790] env[68617]: ERROR nova.compute.manager [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2315.707790] env[68617]: Faults: 
['InvalidArgument'] [ 2315.707790] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Traceback (most recent call last): [ 2315.707790] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2315.707790] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] self.driver.spawn(context, instance, image_meta, [ 2315.707790] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2315.707790] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2315.707790] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2315.707790] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] self._fetch_image_if_missing(context, vi) [ 2315.707790] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2315.707790] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] image_cache(vi, tmp_image_ds_loc) [ 2315.707790] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2315.708138] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] vm_util.copy_virtual_disk( [ 2315.708138] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2315.708138] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] session._wait_for_task(vmdk_copy_task) [ 2315.708138] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2315.708138] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] return self.wait_for_task(task_ref) [ 2315.708138] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2315.708138] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] return evt.wait() [ 2315.708138] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2315.708138] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] result = hub.switch() [ 2315.708138] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2315.708138] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] return self.greenlet.switch() [ 2315.708138] env[68617]: ERROR nova.compute.manager [instance: 
b1a8dc60-af98-4f80-96cf-b2550ea8c13a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2315.708138] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] self.f(*self.args, **self.kw) [ 2315.708495] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2315.708495] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] raise exceptions.translate_fault(task_info.error) [ 2315.708495] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2315.708495] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Faults: ['InvalidArgument'] [ 2315.708495] env[68617]: ERROR nova.compute.manager [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] [ 2315.708495] env[68617]: DEBUG nova.compute.utils [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2315.709783] env[68617]: DEBUG nova.compute.manager [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Build of instance b1a8dc60-af98-4f80-96cf-b2550ea8c13a was re-scheduled: A specified parameter was not correct: fileType [ 2315.709783] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2315.710166] env[68617]: DEBUG nova.compute.manager [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2315.710340] env[68617]: DEBUG nova.compute.manager [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2315.710506] env[68617]: DEBUG nova.compute.manager [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2315.710707] env[68617]: DEBUG nova.network.neutron [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2316.011739] env[68617]: DEBUG nova.network.neutron [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2316.022394] env[68617]: INFO nova.compute.manager [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Took 0.31 seconds to deallocate network for instance. [ 2316.114808] env[68617]: INFO nova.scheduler.client.report [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Deleted allocations for instance b1a8dc60-af98-4f80-96cf-b2550ea8c13a [ 2316.138918] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c6837375-c34a-460e-ab35-836a76056e31 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "b1a8dc60-af98-4f80-96cf-b2550ea8c13a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 621.103s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2316.139169] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ea2039ab-4bbd-4b5a-af4d-095d8a0fa125 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "b1a8dc60-af98-4f80-96cf-b2550ea8c13a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 424.317s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2316.139467] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ea2039ab-4bbd-4b5a-af4d-095d8a0fa125 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Acquiring lock "b1a8dc60-af98-4f80-96cf-b2550ea8c13a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2316.139720] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ea2039ab-4bbd-4b5a-af4d-095d8a0fa125 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "b1a8dc60-af98-4f80-96cf-b2550ea8c13a-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2316.139936] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ea2039ab-4bbd-4b5a-af4d-095d8a0fa125 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "b1a8dc60-af98-4f80-96cf-b2550ea8c13a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2316.141905] env[68617]: INFO nova.compute.manager [None req-ea2039ab-4bbd-4b5a-af4d-095d8a0fa125 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Terminating instance [ 2316.143647] env[68617]: DEBUG nova.compute.manager [None req-ea2039ab-4bbd-4b5a-af4d-095d8a0fa125 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2316.143858] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ea2039ab-4bbd-4b5a-af4d-095d8a0fa125 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2316.144392] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6830d253-2a1c-4009-83b5-17829315dddc {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.153592] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4c49a4e-1786-4619-9c40-a2f1acc433f4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.179453] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-ea2039ab-4bbd-4b5a-af4d-095d8a0fa125 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b1a8dc60-af98-4f80-96cf-b2550ea8c13a could not be found. [ 2316.179749] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ea2039ab-4bbd-4b5a-af4d-095d8a0fa125 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2316.179828] env[68617]: INFO nova.compute.manager [None req-ea2039ab-4bbd-4b5a-af4d-095d8a0fa125 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2316.180075] env[68617]: DEBUG oslo.service.loopingcall [None req-ea2039ab-4bbd-4b5a-af4d-095d8a0fa125 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2316.180532] env[68617]: DEBUG nova.compute.manager [-] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2316.180674] env[68617]: DEBUG nova.network.neutron [-] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2316.207535] env[68617]: DEBUG nova.network.neutron [-] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2316.215555] env[68617]: INFO nova.compute.manager [-] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] Took 0.03 seconds to deallocate network for instance. [ 2316.302978] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ea2039ab-4bbd-4b5a-af4d-095d8a0fa125 tempest-ServerDiskConfigTestJSON-797669125 tempest-ServerDiskConfigTestJSON-797669125-project-member] Lock "b1a8dc60-af98-4f80-96cf-b2550ea8c13a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.164s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2316.303862] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "b1a8dc60-af98-4f80-96cf-b2550ea8c13a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 412.653s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2316.304212] env[68617]: INFO nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: b1a8dc60-af98-4f80-96cf-b2550ea8c13a] During sync_power_state the instance has a pending task (deleting). Skip. 
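
The oslo.service.loopingcall record above ("Waiting for function ... _deallocate_network_with_retries to return") shows the other polling primitive in play: Nova wraps retryable work in a looping call and waits on the event its start() returns (this particular path uses a back-off variant; the fixed-interval form is shown here for brevity). A minimal sketch with a stand-in body, not the real deallocation code:

    from oslo_service import loopingcall

    def _work():
        # Raising LoopingCallDone stops the loop and becomes the wait() result;
        # returning normally lets the loop run again after the interval.
        raise loopingcall.LoopingCallDone(retvalue=True)

    timer = loopingcall.FixedIntervalLoopingCall(_work)
    result = timer.start(interval=1).wait()  # True once _work signals done
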
[ 2316.304291] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "b1a8dc60-af98-4f80-96cf-b2550ea8c13a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2325.139792] env[68617]: DEBUG oslo_concurrency.lockutils [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "2edb4d02-dec4-4e7d-9c57-6b2b147740ad" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2325.140131] env[68617]: DEBUG oslo_concurrency.lockutils [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "2edb4d02-dec4-4e7d-9c57-6b2b147740ad" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2325.150564] env[68617]: DEBUG nova.compute.manager [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2325.196757] env[68617]: DEBUG oslo_concurrency.lockutils [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2325.196998] env[68617]: DEBUG oslo_concurrency.lockutils [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2325.198408] env[68617]: INFO nova.compute.claims [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2325.312891] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14660c9d-23db-47cb-81c7-8175f6674c72 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2325.320687] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67d76a9d-8a54-4995-959d-a8b38b3d5071 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2325.349391] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-f6d4432a-6d82-4819-a272-d29414d303f2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2325.355976] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa3200be-6f2e-45bb-bf43-4ecd203eba57 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2325.368501] env[68617]: DEBUG nova.compute.provider_tree [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2325.380032] env[68617]: DEBUG nova.scheduler.client.report [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2325.393300] env[68617]: DEBUG oslo_concurrency.lockutils [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.196s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2325.393781] env[68617]: DEBUG nova.compute.manager [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2325.426089] env[68617]: DEBUG nova.compute.utils [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2325.427480] env[68617]: DEBUG nova.compute.manager [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] Allocating IP information in the background. 
{{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2325.427612] env[68617]: DEBUG nova.network.neutron [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2325.447119] env[68617]: DEBUG nova.compute.manager [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2325.497059] env[68617]: DEBUG nova.policy [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '11eecc8f059e410cb97bafaadc378f89', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4de7b27e9cf04c16b8dee80e756404fd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 2325.511303] env[68617]: DEBUG nova.compute.manager [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2325.535679] env[68617]: DEBUG nova.virt.hardware [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2325.535927] env[68617]: DEBUG nova.virt.hardware [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2325.536103] env[68617]: DEBUG nova.virt.hardware [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2325.536288] env[68617]: DEBUG nova.virt.hardware [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2325.536434] env[68617]: DEBUG nova.virt.hardware [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2325.536578] env[68617]: DEBUG nova.virt.hardware [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2325.536817] env[68617]: DEBUG nova.virt.hardware [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2325.537036] env[68617]: DEBUG nova.virt.hardware [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2325.537219] env[68617]: DEBUG nova.virt.hardware [None 
req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2325.537388] env[68617]: DEBUG nova.virt.hardware [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2325.537563] env[68617]: DEBUG nova.virt.hardware [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2325.538404] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7a51d18-4a5d-4e83-8851-2229512c133d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2325.548046] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-752624fe-e67d-4f09-bbc3-bfea43478dfb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2325.785870] env[68617]: DEBUG nova.network.neutron [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] Successfully created port: e484ff03-c895-438e-b829-9436ff826bc4 {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2326.331338] env[68617]: DEBUG nova.compute.manager [req-2e5519fa-cc1c-446d-98cf-3182344c9e32 req-ab1b1fa8-1a4e-4937-acac-06b9e1cc3cc4 service nova] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] Received event network-vif-plugged-e484ff03-c895-438e-b829-9436ff826bc4 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2326.331338] env[68617]: DEBUG oslo_concurrency.lockutils [req-2e5519fa-cc1c-446d-98cf-3182344c9e32 req-ab1b1fa8-1a4e-4937-acac-06b9e1cc3cc4 service nova] Acquiring lock "2edb4d02-dec4-4e7d-9c57-6b2b147740ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2326.331338] env[68617]: DEBUG oslo_concurrency.lockutils [req-2e5519fa-cc1c-446d-98cf-3182344c9e32 req-ab1b1fa8-1a4e-4937-acac-06b9e1cc3cc4 service nova] Lock "2edb4d02-dec4-4e7d-9c57-6b2b147740ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2326.331338] env[68617]: DEBUG oslo_concurrency.lockutils [req-2e5519fa-cc1c-446d-98cf-3182344c9e32 req-ab1b1fa8-1a4e-4937-acac-06b9e1cc3cc4 service nova] Lock "2edb4d02-dec4-4e7d-9c57-6b2b147740ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2326.332048] env[68617]: DEBUG nova.compute.manager 
[req-2e5519fa-cc1c-446d-98cf-3182344c9e32 req-ab1b1fa8-1a4e-4937-acac-06b9e1cc3cc4 service nova] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] No waiting events found dispatching network-vif-plugged-e484ff03-c895-438e-b829-9436ff826bc4 {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2326.332730] env[68617]: WARNING nova.compute.manager [req-2e5519fa-cc1c-446d-98cf-3182344c9e32 req-ab1b1fa8-1a4e-4937-acac-06b9e1cc3cc4 service nova] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] Received unexpected event network-vif-plugged-e484ff03-c895-438e-b829-9436ff826bc4 for instance with vm_state building and task_state spawning. [ 2326.335811] env[68617]: DEBUG nova.network.neutron [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] Successfully updated port: e484ff03-c895-438e-b829-9436ff826bc4 {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2326.347744] env[68617]: DEBUG oslo_concurrency.lockutils [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "refresh_cache-2edb4d02-dec4-4e7d-9c57-6b2b147740ad" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2326.348356] env[68617]: DEBUG oslo_concurrency.lockutils [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquired lock "refresh_cache-2edb4d02-dec4-4e7d-9c57-6b2b147740ad" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2326.348614] env[68617]: DEBUG nova.network.neutron [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2326.385830] env[68617]: DEBUG nova.network.neutron [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] Instance cache missing network info. 
{{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2326.540019] env[68617]: DEBUG nova.network.neutron [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] Updating instance_info_cache with network_info: [{"id": "e484ff03-c895-438e-b829-9436ff826bc4", "address": "fa:16:3e:95:7e:20", "network": {"id": "ad29e76d-388b-42ca-9526-7b6c236321e3", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1855301645-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4de7b27e9cf04c16b8dee80e756404fd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8e272539-d425-489f-9a63-aba692e88933", "external-id": "nsx-vlan-transportzone-869", "segmentation_id": 869, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape484ff03-c8", "ovs_interfaceid": "e484ff03-c895-438e-b829-9436ff826bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2326.552125] env[68617]: DEBUG oslo_concurrency.lockutils [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Releasing lock "refresh_cache-2edb4d02-dec4-4e7d-9c57-6b2b147740ad" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2326.552414] env[68617]: DEBUG nova.compute.manager [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] Instance network_info: |[{"id": "e484ff03-c895-438e-b829-9436ff826bc4", "address": "fa:16:3e:95:7e:20", "network": {"id": "ad29e76d-388b-42ca-9526-7b6c236321e3", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1855301645-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4de7b27e9cf04c16b8dee80e756404fd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8e272539-d425-489f-9a63-aba692e88933", "external-id": "nsx-vlan-transportzone-869", "segmentation_id": 869, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape484ff03-c8", "ovs_interfaceid": "e484ff03-c895-438e-b829-9436ff826bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 2326.552823] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:95:7e:20', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8e272539-d425-489f-9a63-aba692e88933', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e484ff03-c895-438e-b829-9436ff826bc4', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2326.560266] env[68617]: DEBUG oslo.service.loopingcall [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2326.560714] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2326.560950] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-368971fe-857f-46c9-b49e-25344cfb8efb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2326.580473] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2326.580473] env[68617]: value = "task-3470910" [ 2326.580473] env[68617]: _type = "Task" [ 2326.580473] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2326.587804] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470910, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2327.090984] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470910, 'name': CreateVM_Task, 'duration_secs': 0.317334} completed successfully. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2327.091208] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2327.091849] env[68617]: DEBUG oslo_concurrency.lockutils [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2327.092018] env[68617]: DEBUG oslo_concurrency.lockutils [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2327.092335] env[68617]: DEBUG oslo_concurrency.lockutils [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2327.092575] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b2b0d16c-8053-4334-b283-a152ed563170 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2327.096805] env[68617]: DEBUG oslo_vmware.api [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Waiting for the task: (returnval){ [ 2327.096805] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]5282c5c3-34b0-1b40-3b11-7243c0a15afa" [ 2327.096805] env[68617]: _type = "Task" [ 2327.096805] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2327.104319] env[68617]: DEBUG oslo_vmware.api [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]5282c5c3-34b0-1b40-3b11-7243c0a15afa, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2327.607439] env[68617]: DEBUG oslo_concurrency.lockutils [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2327.607813] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2327.607892] env[68617]: DEBUG oslo_concurrency.lockutils [None req-862a1856-bc51-4a9a-a6b8-caac4b841853 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2328.356569] env[68617]: DEBUG nova.compute.manager [req-ca374aad-dc4b-45e8-9700-1ce49633e06b req-613caadf-51d6-4270-affd-03c762938322 service nova] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] Received event network-changed-e484ff03-c895-438e-b829-9436ff826bc4 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2328.356715] env[68617]: DEBUG nova.compute.manager [req-ca374aad-dc4b-45e8-9700-1ce49633e06b req-613caadf-51d6-4270-affd-03c762938322 service nova] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] Refreshing instance network info cache due to event network-changed-e484ff03-c895-438e-b829-9436ff826bc4. {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2328.356933] env[68617]: DEBUG oslo_concurrency.lockutils [req-ca374aad-dc4b-45e8-9700-1ce49633e06b req-613caadf-51d6-4270-affd-03c762938322 service nova] Acquiring lock "refresh_cache-2edb4d02-dec4-4e7d-9c57-6b2b147740ad" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2328.357067] env[68617]: DEBUG oslo_concurrency.lockutils [req-ca374aad-dc4b-45e8-9700-1ce49633e06b req-613caadf-51d6-4270-affd-03c762938322 service nova] Acquired lock "refresh_cache-2edb4d02-dec4-4e7d-9c57-6b2b147740ad" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2328.357225] env[68617]: DEBUG nova.network.neutron [req-ca374aad-dc4b-45e8-9700-1ce49633e06b req-613caadf-51d6-4270-affd-03c762938322 service nova] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] Refreshing network info cache for port e484ff03-c895-438e-b829-9436ff826bc4 {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2328.759509] env[68617]: DEBUG nova.network.neutron [req-ca374aad-dc4b-45e8-9700-1ce49633e06b req-613caadf-51d6-4270-affd-03c762938322 service nova] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] Updated VIF entry in instance network info cache for port e484ff03-c895-438e-b829-9436ff826bc4. 
{{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2328.759873] env[68617]: DEBUG nova.network.neutron [req-ca374aad-dc4b-45e8-9700-1ce49633e06b req-613caadf-51d6-4270-affd-03c762938322 service nova] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] Updating instance_info_cache with network_info: [{"id": "e484ff03-c895-438e-b829-9436ff826bc4", "address": "fa:16:3e:95:7e:20", "network": {"id": "ad29e76d-388b-42ca-9526-7b6c236321e3", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1855301645-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4de7b27e9cf04c16b8dee80e756404fd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8e272539-d425-489f-9a63-aba692e88933", "external-id": "nsx-vlan-transportzone-869", "segmentation_id": 869, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape484ff03-c8", "ovs_interfaceid": "e484ff03-c895-438e-b829-9436ff826bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2328.768751] env[68617]: DEBUG oslo_concurrency.lockutils [req-ca374aad-dc4b-45e8-9700-1ce49633e06b req-613caadf-51d6-4270-affd-03c762938322 service nova] Releasing lock "refresh_cache-2edb4d02-dec4-4e7d-9c57-6b2b147740ad" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2335.367054] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquiring lock "9a360442-5f4c-4379-a8d5-a6e09ac29ea9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2335.367401] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Lock "9a360442-5f4c-4379-a8d5-a6e09ac29ea9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2335.378815] env[68617]: DEBUG nova.compute.manager [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] Starting instance... 
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2335.425914] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2335.426304] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2335.428090] env[68617]: INFO nova.compute.claims [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2335.562407] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b8852da-7d3f-43c4-98e1-ffe20eabd361 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2335.569930] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cc4231f-1922-4bfd-8fc4-c735d5e9c1db {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2335.599142] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1871130-9900-42bb-881c-cbcba08fc328 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2335.606091] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eebd0d46-a5c4-49c0-be91-71fcbc85d244 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2335.620245] env[68617]: DEBUG nova.compute.provider_tree [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2335.630095] env[68617]: DEBUG nova.scheduler.client.report [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2335.643231] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 
tempest-ServersTestJSON-1350841761-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.217s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2335.643762] env[68617]: DEBUG nova.compute.manager [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2335.676659] env[68617]: DEBUG nova.compute.utils [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2335.677957] env[68617]: DEBUG nova.compute.manager [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] Allocating IP information in the background. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2335.678142] env[68617]: DEBUG nova.network.neutron [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] allocate_for_instance() {{(pid=68617) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2335.686868] env[68617]: DEBUG nova.compute.manager [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2335.742758] env[68617]: DEBUG nova.policy [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '546f17dfba284c76b4ff2dde1a09928a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '162ecdbf203345a5b63167459e388608', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68617) authorize /opt/stack/nova/nova/policy.py:203}} [ 2335.748817] env[68617]: DEBUG nova.compute.manager [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2335.773859] env[68617]: DEBUG nova.virt.hardware [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2335.774154] env[68617]: DEBUG nova.virt.hardware [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2335.774316] env[68617]: DEBUG nova.virt.hardware [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2335.774496] env[68617]: DEBUG nova.virt.hardware [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2335.774642] env[68617]: DEBUG nova.virt.hardware [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2335.774786] env[68617]: DEBUG nova.virt.hardware [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2335.774988] env[68617]: DEBUG nova.virt.hardware [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2335.775419] env[68617]: DEBUG nova.virt.hardware [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2335.775419] env[68617]: DEBUG nova.virt.hardware [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 
tempest-ServersTestJSON-1350841761-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2335.775518] env[68617]: DEBUG nova.virt.hardware [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2335.775653] env[68617]: DEBUG nova.virt.hardware [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2335.776622] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50b86847-c4a1-4882-9a4c-371d4ec9e813 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2335.784365] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10bb67bd-0a8c-4306-8e66-2c3e4a3bfd56 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2336.038693] env[68617]: DEBUG nova.network.neutron [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] Successfully created port: d2b8db83-b03b-42b1-9358-848e8bbe91e5 {{(pid=68617) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2336.760833] env[68617]: DEBUG nova.compute.manager [req-d3921a4d-5277-49e0-826f-deb1231987df req-ac50cde0-f869-4296-8377-b15c29409b8e service nova] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] Received event network-vif-plugged-d2b8db83-b03b-42b1-9358-848e8bbe91e5 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2336.761178] env[68617]: DEBUG oslo_concurrency.lockutils [req-d3921a4d-5277-49e0-826f-deb1231987df req-ac50cde0-f869-4296-8377-b15c29409b8e service nova] Acquiring lock "9a360442-5f4c-4379-a8d5-a6e09ac29ea9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2336.761331] env[68617]: DEBUG oslo_concurrency.lockutils [req-d3921a4d-5277-49e0-826f-deb1231987df req-ac50cde0-f869-4296-8377-b15c29409b8e service nova] Lock "9a360442-5f4c-4379-a8d5-a6e09ac29ea9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2336.761567] env[68617]: DEBUG oslo_concurrency.lockutils [req-d3921a4d-5277-49e0-826f-deb1231987df req-ac50cde0-f869-4296-8377-b15c29409b8e service nova] Lock "9a360442-5f4c-4379-a8d5-a6e09ac29ea9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2336.761715] env[68617]: DEBUG nova.compute.manager [req-d3921a4d-5277-49e0-826f-deb1231987df req-ac50cde0-f869-4296-8377-b15c29409b8e service nova] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] 
No waiting events found dispatching network-vif-plugged-d2b8db83-b03b-42b1-9358-848e8bbe91e5 {{(pid=68617) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2336.761882] env[68617]: WARNING nova.compute.manager [req-d3921a4d-5277-49e0-826f-deb1231987df req-ac50cde0-f869-4296-8377-b15c29409b8e service nova] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] Received unexpected event network-vif-plugged-d2b8db83-b03b-42b1-9358-848e8bbe91e5 for instance with vm_state building and task_state spawning. [ 2336.843923] env[68617]: DEBUG nova.network.neutron [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] Successfully updated port: d2b8db83-b03b-42b1-9358-848e8bbe91e5 {{(pid=68617) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2336.855641] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquiring lock "refresh_cache-9a360442-5f4c-4379-a8d5-a6e09ac29ea9" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2336.855799] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquired lock "refresh_cache-9a360442-5f4c-4379-a8d5-a6e09ac29ea9" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2336.855980] env[68617]: DEBUG nova.network.neutron [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] Building network info cache for instance {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2336.894771] env[68617]: DEBUG nova.network.neutron [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] Instance cache missing network info. 
{{(pid=68617) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2337.049874] env[68617]: DEBUG nova.network.neutron [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] Updating instance_info_cache with network_info: [{"id": "d2b8db83-b03b-42b1-9358-848e8bbe91e5", "address": "fa:16:3e:f1:0a:5d", "network": {"id": "e6650a9f-f26d-481d-8658-10ff40328891", "bridge": "br-int", "label": "tempest-ServersTestJSON-1149134727-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "162ecdbf203345a5b63167459e388608", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "aa09e855-8af1-419b-b78d-8ffcc94b1bfb", "external-id": "nsx-vlan-transportzone-901", "segmentation_id": 901, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd2b8db83-b0", "ovs_interfaceid": "d2b8db83-b03b-42b1-9358-848e8bbe91e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2337.063394] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Releasing lock "refresh_cache-9a360442-5f4c-4379-a8d5-a6e09ac29ea9" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2337.063803] env[68617]: DEBUG nova.compute.manager [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] Instance network_info: |[{"id": "d2b8db83-b03b-42b1-9358-848e8bbe91e5", "address": "fa:16:3e:f1:0a:5d", "network": {"id": "e6650a9f-f26d-481d-8658-10ff40328891", "bridge": "br-int", "label": "tempest-ServersTestJSON-1149134727-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "162ecdbf203345a5b63167459e388608", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "aa09e855-8af1-419b-b78d-8ffcc94b1bfb", "external-id": "nsx-vlan-transportzone-901", "segmentation_id": 901, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd2b8db83-b0", "ovs_interfaceid": "d2b8db83-b03b-42b1-9358-848e8bbe91e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2337.064087] env[68617]: 
DEBUG nova.virt.vmwareapi.vmops [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f1:0a:5d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'aa09e855-8af1-419b-b78d-8ffcc94b1bfb', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd2b8db83-b03b-42b1-9358-848e8bbe91e5', 'vif_model': 'vmxnet3'}] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2337.071632] env[68617]: DEBUG oslo.service.loopingcall [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2337.072422] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2337.072637] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3dc912fe-e155-457d-b556-1524a6bd3c01 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2337.093164] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2337.093164] env[68617]: value = "task-3470911" [ 2337.093164] env[68617]: _type = "Task" [ 2337.093164] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2337.100866] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470911, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2337.603536] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470911, 'name': CreateVM_Task, 'duration_secs': 0.326316} completed successfully. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2337.603744] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2337.604482] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2337.604640] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2337.604952] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2337.605218] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-13d39704-40d5-4a2a-b1ec-b2a4e3f38259 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2337.609753] env[68617]: DEBUG oslo_vmware.api [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Waiting for the task: (returnval){ [ 2337.609753] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]5255722a-2d6b-1a7b-e564-1c78f343f414" [ 2337.609753] env[68617]: _type = "Task" [ 2337.609753] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2337.618728] env[68617]: DEBUG oslo_vmware.api [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]5255722a-2d6b-1a7b-e564-1c78f343f414, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2338.120397] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2338.120688] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2338.120911] env[68617]: DEBUG oslo_concurrency.lockutils [None req-a4263441-0e2e-4f1c-8599-75bfc813a52f tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2338.792246] env[68617]: DEBUG nova.compute.manager [req-84e5fbc2-1236-4f5c-82d8-1b4d2df2aede req-bf826923-7ba7-4f65-932e-5b4b89e8a7de service nova] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] Received event network-changed-d2b8db83-b03b-42b1-9358-848e8bbe91e5 {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2338.792457] env[68617]: DEBUG nova.compute.manager [req-84e5fbc2-1236-4f5c-82d8-1b4d2df2aede req-bf826923-7ba7-4f65-932e-5b4b89e8a7de service nova] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] Refreshing instance network info cache due to event network-changed-d2b8db83-b03b-42b1-9358-848e8bbe91e5. {{(pid=68617) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2338.792669] env[68617]: DEBUG oslo_concurrency.lockutils [req-84e5fbc2-1236-4f5c-82d8-1b4d2df2aede req-bf826923-7ba7-4f65-932e-5b4b89e8a7de service nova] Acquiring lock "refresh_cache-9a360442-5f4c-4379-a8d5-a6e09ac29ea9" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2338.792809] env[68617]: DEBUG oslo_concurrency.lockutils [req-84e5fbc2-1236-4f5c-82d8-1b4d2df2aede req-bf826923-7ba7-4f65-932e-5b4b89e8a7de service nova] Acquired lock "refresh_cache-9a360442-5f4c-4379-a8d5-a6e09ac29ea9" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2338.792988] env[68617]: DEBUG nova.network.neutron [req-84e5fbc2-1236-4f5c-82d8-1b4d2df2aede req-bf826923-7ba7-4f65-932e-5b4b89e8a7de service nova] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] Refreshing network info cache for port d2b8db83-b03b-42b1-9358-848e8bbe91e5 {{(pid=68617) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2339.018945] env[68617]: DEBUG nova.network.neutron [req-84e5fbc2-1236-4f5c-82d8-1b4d2df2aede req-bf826923-7ba7-4f65-932e-5b4b89e8a7de service nova] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] Updated VIF entry in instance network info cache for port d2b8db83-b03b-42b1-9358-848e8bbe91e5. 
{{(pid=68617) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2339.019323] env[68617]: DEBUG nova.network.neutron [req-84e5fbc2-1236-4f5c-82d8-1b4d2df2aede req-bf826923-7ba7-4f65-932e-5b4b89e8a7de service nova] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] Updating instance_info_cache with network_info: [{"id": "d2b8db83-b03b-42b1-9358-848e8bbe91e5", "address": "fa:16:3e:f1:0a:5d", "network": {"id": "e6650a9f-f26d-481d-8658-10ff40328891", "bridge": "br-int", "label": "tempest-ServersTestJSON-1149134727-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "162ecdbf203345a5b63167459e388608", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "aa09e855-8af1-419b-b78d-8ffcc94b1bfb", "external-id": "nsx-vlan-transportzone-901", "segmentation_id": 901, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd2b8db83-b0", "ovs_interfaceid": "d2b8db83-b03b-42b1-9358-848e8bbe91e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2339.028438] env[68617]: DEBUG oslo_concurrency.lockutils [req-84e5fbc2-1236-4f5c-82d8-1b4d2df2aede req-bf826923-7ba7-4f65-932e-5b4b89e8a7de service nova] Releasing lock "refresh_cache-9a360442-5f4c-4379-a8d5-a6e09ac29ea9" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2339.694337] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2342.699382] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2342.699741] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2342.699741] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2342.716858] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2342.717054] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Skipping network cache update for instance because it is Building. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2342.717154] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2342.717281] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2342.717405] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2342.717526] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2342.717647] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2342.717766] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2345.698842] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2345.699252] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2346.699121] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2346.699386] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2346.699582] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2349.700692] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2349.714674] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2349.714891] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2349.715074] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2349.715233] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2349.716347] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8390e6c-766c-4a0b-9592-fa7abbdd6432 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2349.724981] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0fe422d-ce07-4508-9aa3-f1739ccdadeb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2349.738779] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b352f6c-668c-4cb4-a526-78e263e1dcba {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2349.744883] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4c63b54-8cff-411c-b744-e657d56b5116 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2349.774448] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180899MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2349.774589] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
2349.774773] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2349.835570] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance a4ab788d-327a-47cc-8ae7-e1b9be889759 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2349.835737] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 17bb8415-dafd-47ed-9a14-52163ba5e7db actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2349.835899] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 82f72313-f493-4acd-a95e-765feb74a358 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2349.836050] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 797b434e-a913-43dc-a1df-39fe82da1221 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2349.836180] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6a1aa3fb-f182-4b9d-8add-7dfc70472be8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2349.836304] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 2edb4d02-dec4-4e7d-9c57-6b2b147740ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2349.836422] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 9a360442-5f4c-4379-a8d5-a6e09ac29ea9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2349.836604] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2349.836743] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2349.917298] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9e90eab-b9ef-4a3b-9da1-e11f5d40c505 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2349.924886] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c45b5254-e23b-47cd-9b18-a884d108553b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2349.954120] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a9f7b4b-2126-4cdf-bf04-4ea1ec4ca893 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2349.960799] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e65de298-cb9a-4ba2-8d36-a384e00d48d0 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2349.973353] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2349.982373] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2349.995202] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2349.995411] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.221s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2350.993893] env[68617]: DEBUG oslo_service.periodic_task [None 
req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2351.694776] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2351.698352] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2363.168432] env[68617]: WARNING oslo_vmware.rw_handles [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2363.168432] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2363.168432] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2363.168432] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2363.168432] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2363.168432] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 2363.168432] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2363.168432] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2363.168432] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2363.168432] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2363.168432] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2363.168432] env[68617]: ERROR oslo_vmware.rw_handles [ 2363.169181] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/8f09f0ea-71f7-4445-af40-75d9ca69916d/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2363.171374] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2363.171620] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Copying Virtual Disk [datastore2] 
vmware_temp/8f09f0ea-71f7-4445-af40-75d9ca69916d/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/8f09f0ea-71f7-4445-af40-75d9ca69916d/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2363.171901] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d06c9404-0d3e-4ef9-a71f-0fe593802549 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2363.179711] env[68617]: DEBUG oslo_vmware.api [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Waiting for the task: (returnval){ [ 2363.179711] env[68617]: value = "task-3470912" [ 2363.179711] env[68617]: _type = "Task" [ 2363.179711] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2363.187733] env[68617]: DEBUG oslo_vmware.api [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Task: {'id': task-3470912, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2363.690581] env[68617]: DEBUG oslo_vmware.exceptions [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Fault InvalidArgument not matched. {{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2363.690827] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2363.691385] env[68617]: ERROR nova.compute.manager [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2363.691385] env[68617]: Faults: ['InvalidArgument'] [ 2363.691385] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Traceback (most recent call last): [ 2363.691385] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2363.691385] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] yield resources [ 2363.691385] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2363.691385] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] self.driver.spawn(context, instance, image_meta, [ 2363.691385] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] 
File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2363.691385] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2363.691385] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2363.691385] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] self._fetch_image_if_missing(context, vi) [ 2363.691385] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2363.691931] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] image_cache(vi, tmp_image_ds_loc) [ 2363.691931] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2363.691931] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] vm_util.copy_virtual_disk( [ 2363.691931] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2363.691931] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] session._wait_for_task(vmdk_copy_task) [ 2363.691931] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2363.691931] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] return self.wait_for_task(task_ref) [ 2363.691931] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2363.691931] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] return evt.wait() [ 2363.691931] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2363.691931] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] result = hub.switch() [ 2363.691931] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2363.691931] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] return self.greenlet.switch() [ 2363.692271] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2363.692271] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] self.f(*self.args, **self.kw) [ 2363.692271] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2363.692271] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] raise 
exceptions.translate_fault(task_info.error) [ 2363.692271] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2363.692271] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Faults: ['InvalidArgument'] [ 2363.692271] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] [ 2363.692271] env[68617]: INFO nova.compute.manager [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Terminating instance [ 2363.693475] env[68617]: DEBUG oslo_concurrency.lockutils [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2363.693681] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2363.693931] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0fb65ee0-1c2f-4437-b7ab-873ef2ae13fb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2363.696221] env[68617]: DEBUG nova.compute.manager [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Start destroying the instance on the hypervisor. 
{{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2363.696412] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2363.697143] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-400adb56-b76b-401d-a473-f73492cab0c1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2363.704111] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2363.704444] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d50ee70a-9606-41bf-9fb6-c2fd9c59f6b3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2363.706614] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2363.706784] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2363.707739] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-24949667-b88e-4def-ae2e-eab46f70c8ca {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2363.712387] env[68617]: DEBUG oslo_vmware.api [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Waiting for the task: (returnval){ [ 2363.712387] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52bf0016-41fa-4615-5dd9-a452423daebd" [ 2363.712387] env[68617]: _type = "Task" [ 2363.712387] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2363.719420] env[68617]: DEBUG oslo_vmware.api [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52bf0016-41fa-4615-5dd9-a452423daebd, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2363.783448] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2363.783685] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2363.783800] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Deleting the datastore file [datastore2] a4ab788d-327a-47cc-8ae7-e1b9be889759 {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2363.784072] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-57d35046-e612-427a-92df-13bea7156c25 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2363.789981] env[68617]: DEBUG oslo_vmware.api [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Waiting for the task: (returnval){ [ 2363.789981] env[68617]: value = "task-3470914" [ 2363.789981] env[68617]: _type = "Task" [ 2363.789981] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2363.800680] env[68617]: DEBUG oslo_vmware.api [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Task: {'id': task-3470914, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2364.222151] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2364.222519] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Creating directory with path [datastore2] vmware_temp/dba7fca9-8ebc-4096-b40d-bea9e3209805/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2364.222649] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-457b04de-5c16-4f6c-af9d-e4e35845f9eb {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2364.233145] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Created directory with path [datastore2] vmware_temp/dba7fca9-8ebc-4096-b40d-bea9e3209805/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2364.233354] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Fetch image to [datastore2] vmware_temp/dba7fca9-8ebc-4096-b40d-bea9e3209805/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2364.233537] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/dba7fca9-8ebc-4096-b40d-bea9e3209805/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2364.234239] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89335f97-8d1e-4741-bad9-1850ae5139cf {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2364.240412] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ba22aa9-4e19-42c2-8930-ac95eb9cb624 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2364.249102] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb9df33a-455a-49f6-ab3e-ceec79babf95 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2364.279611] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-3343a41d-bd49-4b72-91cb-6aa2e7a5093f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2364.285515] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-68a8f4d7-0851-461d-a9d9-1ac64b6a667b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2364.299115] env[68617]: DEBUG oslo_vmware.api [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Task: {'id': task-3470914, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069556} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2364.299374] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2364.299553] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2364.299782] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2364.300022] env[68617]: INFO nova.compute.manager [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 2364.302244] env[68617]: DEBUG nova.compute.claims [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2364.302413] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2364.302625] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2364.306676] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2364.361449] env[68617]: DEBUG oslo_vmware.rw_handles [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/dba7fca9-8ebc-4096-b40d-bea9e3209805/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2364.420938] env[68617]: DEBUG oslo_vmware.rw_handles [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2364.421153] env[68617]: DEBUG oslo_vmware.rw_handles [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/dba7fca9-8ebc-4096-b40d-bea9e3209805/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2364.483195] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6597ec99-a061-4301-9761-4e07ad9aae91 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2364.490471] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4d551b9-1c09-4b83-ab1e-15c6df8f209c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2364.519532] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b61c419a-48ed-4158-a9d7-f96c9f77a2ac {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2364.526253] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e62ec9b0-8fbd-4944-81ed-285b8e654e72 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2364.538940] env[68617]: DEBUG nova.compute.provider_tree [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2364.547466] env[68617]: DEBUG nova.scheduler.client.report [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2364.561519] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.259s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2364.562042] env[68617]: ERROR nova.compute.manager [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2364.562042] env[68617]: Faults: ['InvalidArgument'] [ 2364.562042] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Traceback (most recent call last): [ 2364.562042] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2364.562042] 
env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] self.driver.spawn(context, instance, image_meta, [ 2364.562042] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2364.562042] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2364.562042] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2364.562042] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] self._fetch_image_if_missing(context, vi) [ 2364.562042] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2364.562042] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] image_cache(vi, tmp_image_ds_loc) [ 2364.562042] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2364.562337] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] vm_util.copy_virtual_disk( [ 2364.562337] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2364.562337] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] session._wait_for_task(vmdk_copy_task) [ 2364.562337] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2364.562337] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] return self.wait_for_task(task_ref) [ 2364.562337] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2364.562337] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] return evt.wait() [ 2364.562337] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2364.562337] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] result = hub.switch() [ 2364.562337] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2364.562337] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] return self.greenlet.switch() [ 2364.562337] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2364.562337] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] self.f(*self.args, **self.kw) [ 2364.562618] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2364.562618] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] raise exceptions.translate_fault(task_info.error) [ 2364.562618] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2364.562618] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Faults: ['InvalidArgument'] [ 2364.562618] env[68617]: ERROR nova.compute.manager [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] [ 2364.562860] env[68617]: DEBUG nova.compute.utils [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2364.564123] env[68617]: DEBUG nova.compute.manager [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Build of instance a4ab788d-327a-47cc-8ae7-e1b9be889759 was re-scheduled: A specified parameter was not correct: fileType [ 2364.564123] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2364.564496] env[68617]: DEBUG nova.compute.manager [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2364.564668] env[68617]: DEBUG nova.compute.manager [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2364.564834] env[68617]: DEBUG nova.compute.manager [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2364.564992] env[68617]: DEBUG nova.network.neutron [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2364.860036] env[68617]: DEBUG nova.network.neutron [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2364.870050] env[68617]: INFO nova.compute.manager [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Took 0.30 seconds to deallocate network for instance. [ 2364.980942] env[68617]: INFO nova.scheduler.client.report [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Deleted allocations for instance a4ab788d-327a-47cc-8ae7-e1b9be889759 [ 2365.005314] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f583d842-47d4-495e-a979-256f4969d74e tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Lock "a4ab788d-327a-47cc-8ae7-e1b9be889759" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 666.940s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2365.005314] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1bcc9db0-15b4-4fc7-ac3d-d65126e3f188 tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Lock "a4ab788d-327a-47cc-8ae7-e1b9be889759" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 470.921s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2365.005542] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1bcc9db0-15b4-4fc7-ac3d-d65126e3f188 tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Acquiring lock "a4ab788d-327a-47cc-8ae7-e1b9be889759-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2365.005707] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1bcc9db0-15b4-4fc7-ac3d-d65126e3f188 tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Lock "a4ab788d-327a-47cc-8ae7-e1b9be889759-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2365.005921] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1bcc9db0-15b4-4fc7-ac3d-d65126e3f188 tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Lock "a4ab788d-327a-47cc-8ae7-e1b9be889759-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2365.007892] env[68617]: INFO nova.compute.manager [None req-1bcc9db0-15b4-4fc7-ac3d-d65126e3f188 tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Terminating instance [ 2365.010010] env[68617]: DEBUG nova.compute.manager [None req-1bcc9db0-15b4-4fc7-ac3d-d65126e3f188 tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2365.010010] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-1bcc9db0-15b4-4fc7-ac3d-d65126e3f188 tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2365.010275] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6cf85061-172d-40f4-9e8b-5d014a10b3fd {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2365.019270] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-157e1ac8-99ff-4db8-89d0-dda784904b2d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2365.046177] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-1bcc9db0-15b4-4fc7-ac3d-d65126e3f188 tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a4ab788d-327a-47cc-8ae7-e1b9be889759 could not be found. [ 2365.046390] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-1bcc9db0-15b4-4fc7-ac3d-d65126e3f188 tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2365.046569] env[68617]: INFO nova.compute.manager [None req-1bcc9db0-15b4-4fc7-ac3d-d65126e3f188 tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2365.046804] env[68617]: DEBUG oslo.service.loopingcall [None req-1bcc9db0-15b4-4fc7-ac3d-d65126e3f188 tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2365.047050] env[68617]: DEBUG nova.compute.manager [-] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2365.047152] env[68617]: DEBUG nova.network.neutron [-] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2365.072150] env[68617]: DEBUG nova.network.neutron [-] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2365.080298] env[68617]: INFO nova.compute.manager [-] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] Took 0.03 seconds to deallocate network for instance. [ 2365.176257] env[68617]: DEBUG oslo_concurrency.lockutils [None req-1bcc9db0-15b4-4fc7-ac3d-d65126e3f188 tempest-ServersTestFqdnHostnames-476153127 tempest-ServersTestFqdnHostnames-476153127-project-member] Lock "a4ab788d-327a-47cc-8ae7-e1b9be889759" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.171s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2365.177220] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "a4ab788d-327a-47cc-8ae7-e1b9be889759" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 461.526s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2365.177534] env[68617]: INFO nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: a4ab788d-327a-47cc-8ae7-e1b9be889759] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 2365.177867] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "a4ab788d-327a-47cc-8ae7-e1b9be889759" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2404.699075] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2404.699075] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2404.699499] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2404.716364] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2404.716515] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2404.716646] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2404.716771] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2404.716895] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2404.717019] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2404.717149] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2406.699020] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2406.699378] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2406.699378] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2406.699536] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2406.699642] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2409.700056] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2409.711698] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2409.711925] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2409.712111] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2409.712270] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2409.713372] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0be8bdb1-901e-4d48-8f7c-6c838f87083d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2409.722958] env[68617]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-535374e1-c711-4dbb-a93f-12f2642f8389 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2409.736682] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0305363b-4d60-4eb9-91a1-e46a818974d6 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2409.742894] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29d35d92-61a3-478e-8bc0-745a817d0c62 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2409.771383] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180853MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2409.771524] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2409.771705] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2409.828011] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 17bb8415-dafd-47ed-9a14-52163ba5e7db actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2409.828189] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 82f72313-f493-4acd-a95e-765feb74a358 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2409.828321] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 797b434e-a913-43dc-a1df-39fe82da1221 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2409.828445] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6a1aa3fb-f182-4b9d-8add-7dfc70472be8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2409.828565] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 2edb4d02-dec4-4e7d-9c57-6b2b147740ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2409.828683] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 9a360442-5f4c-4379-a8d5-a6e09ac29ea9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2409.828856] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2409.828990] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2409.904256] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75baf8d2-ee6c-40da-a8a2-d28052b34b7c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2409.911264] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bea8672-850d-4efd-ae2e-fd33a90ed3d2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2409.941417] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74482c9c-7a8e-4728-9530-69c7b50ade45 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2409.948009] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78f326ab-a0d2-4705-ac70-7cf008bd0080 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2409.960533] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2409.968837] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2409.981259] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2409.981428] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.210s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2410.981724] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2412.716446] env[68617]: WARNING oslo_vmware.rw_handles [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2412.716446] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2412.716446] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2412.716446] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2412.716446] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2412.716446] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 2412.716446] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2412.716446] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2412.716446] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2412.716446] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2412.716446] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2412.716446] env[68617]: ERROR oslo_vmware.rw_handles [ 2412.718422] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Downloaded image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/dba7fca9-8ebc-4096-b40d-bea9e3209805/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2412.718916] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2412.719215] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None 
req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Copying Virtual Disk [datastore2] vmware_temp/dba7fca9-8ebc-4096-b40d-bea9e3209805/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/dba7fca9-8ebc-4096-b40d-bea9e3209805/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2412.719502] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3c124279-aa87-4ac9-99dd-2f33a52ae9c7 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2412.727879] env[68617]: DEBUG oslo_vmware.api [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Waiting for the task: (returnval){ [ 2412.727879] env[68617]: value = "task-3470915" [ 2412.727879] env[68617]: _type = "Task" [ 2412.727879] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2412.735793] env[68617]: DEBUG oslo_vmware.api [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Task: {'id': task-3470915, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2413.238238] env[68617]: DEBUG oslo_vmware.exceptions [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Fault InvalidArgument not matched. 
{{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2413.238487] env[68617]: DEBUG oslo_concurrency.lockutils [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2413.239060] env[68617]: ERROR nova.compute.manager [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2413.239060] env[68617]: Faults: ['InvalidArgument'] [ 2413.239060] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Traceback (most recent call last): [ 2413.239060] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2413.239060] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] yield resources [ 2413.239060] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2413.239060] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] self.driver.spawn(context, instance, image_meta, [ 2413.239060] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2413.239060] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2413.239060] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2413.239060] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] self._fetch_image_if_missing(context, vi) [ 2413.239060] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2413.239400] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] image_cache(vi, tmp_image_ds_loc) [ 2413.239400] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2413.239400] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] vm_util.copy_virtual_disk( [ 2413.239400] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2413.239400] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] session._wait_for_task(vmdk_copy_task) [ 2413.239400] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2413.239400] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] return self.wait_for_task(task_ref) [ 2413.239400] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2413.239400] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] return evt.wait() [ 2413.239400] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2413.239400] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] result = hub.switch() [ 2413.239400] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2413.239400] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] return self.greenlet.switch() [ 2413.239906] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2413.239906] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] self.f(*self.args, **self.kw) [ 2413.239906] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2413.239906] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] raise exceptions.translate_fault(task_info.error) [ 2413.239906] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2413.239906] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Faults: ['InvalidArgument'] [ 2413.239906] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] [ 2413.239906] env[68617]: INFO nova.compute.manager [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Terminating instance [ 2413.241035] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2413.241147] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2413.241316] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5e7c9e62-fdec-4e9c-b79e-7d4912ea5c9b 
{{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2413.243430] env[68617]: DEBUG nova.compute.manager [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2413.243614] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2413.244336] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f74b72f-4907-472e-8da6-d29a0f24aea9 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2413.250908] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2413.251124] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-00dd1b4c-3b6c-4321-9e72-d824407a78dd {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2413.253179] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2413.253349] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2413.254326] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5d7c6b8a-59b9-4398-863c-06172703e4fa {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2413.259045] env[68617]: DEBUG oslo_vmware.api [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Waiting for the task: (returnval){ [ 2413.259045] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]5269472f-9229-f178-2f26-23237379683e" [ 2413.259045] env[68617]: _type = "Task" [ 2413.259045] env[68617]: } to complete. 
{{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2413.266167] env[68617]: DEBUG oslo_vmware.api [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]5269472f-9229-f178-2f26-23237379683e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2413.319256] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2413.319499] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2413.319647] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Deleting the datastore file [datastore2] 17bb8415-dafd-47ed-9a14-52163ba5e7db {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2413.319904] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1b76ebbe-6508-4cf7-ba13-8a76fadce4bd {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2413.326440] env[68617]: DEBUG oslo_vmware.api [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Waiting for the task: (returnval){ [ 2413.326440] env[68617]: value = "task-3470917" [ 2413.326440] env[68617]: _type = "Task" [ 2413.326440] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2413.334059] env[68617]: DEBUG oslo_vmware.api [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Task: {'id': task-3470917, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2413.694152] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2413.698714] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2413.769101] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2413.769407] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Creating directory with path [datastore2] vmware_temp/794b5d85-c807-4fc6-a380-dfb54467a2c6/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2413.769577] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-80f64718-38d4-48d4-9c80-864d4aaba53d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2413.780441] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Created directory with path [datastore2] vmware_temp/794b5d85-c807-4fc6-a380-dfb54467a2c6/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2413.780630] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Fetch image to [datastore2] vmware_temp/794b5d85-c807-4fc6-a380-dfb54467a2c6/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2413.780796] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/794b5d85-c807-4fc6-a380-dfb54467a2c6/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2413.781506] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d15ab29e-b66f-4997-b9b9-b859e5699736 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2413.787787] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-ecb55105-0ef8-4015-9307-bef22b8e8baa {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2413.796731] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67e07d01-2440-413c-a58e-fbe15bec1e0b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2413.831291] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b7cfff6-3c15-4d78-9f13-b0093b63c9f1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2413.838533] env[68617]: DEBUG oslo_vmware.api [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Task: {'id': task-3470917, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.080158} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2413.840108] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2413.840303] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2413.840474] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2413.840645] env[68617]: INFO nova.compute.manager [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 2413.842391] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3d12e797-c456-464c-8941-ff80417c0015 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2413.844347] env[68617]: DEBUG nova.compute.claims [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2413.844513] env[68617]: DEBUG oslo_concurrency.lockutils [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2413.844726] env[68617]: DEBUG oslo_concurrency.lockutils [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2413.866317] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2413.983015] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97d380ef-ff50-436e-8e4c-877267d38d4a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2413.992791] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8221a03-66e6-49fd-bae9-7ed0f78fde14 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2414.022159] env[68617]: DEBUG oslo_vmware.rw_handles [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/794b5d85-c807-4fc6-a380-dfb54467a2c6/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2414.023962] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afd41f74-9fb8-4b1c-b298-65d43844c644 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2414.085118] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ebb2d0b-53dd-4f56-8957-d1541899b759 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2414.089380] env[68617]: DEBUG oslo_vmware.rw_handles [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2414.089548] env[68617]: DEBUG oslo_vmware.rw_handles [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/794b5d85-c807-4fc6-a380-dfb54467a2c6/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2414.099205] env[68617]: DEBUG nova.compute.provider_tree [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2414.108444] env[68617]: DEBUG nova.scheduler.client.report [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2414.121923] env[68617]: DEBUG oslo_concurrency.lockutils [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.277s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2414.122119] env[68617]: ERROR nova.compute.manager [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2414.122119] env[68617]: Faults: ['InvalidArgument'] [ 
2414.122119] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Traceback (most recent call last): [ 2414.122119] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2414.122119] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] self.driver.spawn(context, instance, image_meta, [ 2414.122119] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2414.122119] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2414.122119] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2414.122119] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] self._fetch_image_if_missing(context, vi) [ 2414.122119] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2414.122119] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] image_cache(vi, tmp_image_ds_loc) [ 2414.122119] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2414.122460] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] vm_util.copy_virtual_disk( [ 2414.122460] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2414.122460] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] session._wait_for_task(vmdk_copy_task) [ 2414.122460] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2414.122460] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] return self.wait_for_task(task_ref) [ 2414.122460] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2414.122460] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] return evt.wait() [ 2414.122460] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2414.122460] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] result = hub.switch() [ 2414.122460] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2414.122460] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] return self.greenlet.switch() [ 2414.122460] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2414.122460] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] self.f(*self.args, **self.kw) [ 2414.122734] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2414.122734] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] raise exceptions.translate_fault(task_info.error) [ 2414.122734] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2414.122734] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Faults: ['InvalidArgument'] [ 2414.122734] env[68617]: ERROR nova.compute.manager [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] [ 2414.122849] env[68617]: DEBUG nova.compute.utils [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2414.124272] env[68617]: DEBUG nova.compute.manager [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Build of instance 17bb8415-dafd-47ed-9a14-52163ba5e7db was re-scheduled: A specified parameter was not correct: fileType [ 2414.124272] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2414.124637] env[68617]: DEBUG nova.compute.manager [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2414.124816] env[68617]: DEBUG nova.compute.manager [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2414.125013] env[68617]: DEBUG nova.compute.manager [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2414.125191] env[68617]: DEBUG nova.network.neutron [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2414.423450] env[68617]: DEBUG nova.network.neutron [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2414.433934] env[68617]: INFO nova.compute.manager [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Took 0.31 seconds to deallocate network for instance. [ 2414.550421] env[68617]: INFO nova.scheduler.client.report [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Deleted allocations for instance 17bb8415-dafd-47ed-9a14-52163ba5e7db [ 2414.573058] env[68617]: DEBUG oslo_concurrency.lockutils [None req-90379122-3e0d-4ff2-b40b-384d7a25b6f2 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Lock "17bb8415-dafd-47ed-9a14-52163ba5e7db" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 495.317s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2414.573335] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d97fd8ce-beab-46d3-890b-7260e3649488 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Lock "17bb8415-dafd-47ed-9a14-52163ba5e7db" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 298.748s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2414.573556] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d97fd8ce-beab-46d3-890b-7260e3649488 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Acquiring lock "17bb8415-dafd-47ed-9a14-52163ba5e7db-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2414.573758] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d97fd8ce-beab-46d3-890b-7260e3649488 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Lock "17bb8415-dafd-47ed-9a14-52163ba5e7db-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2414.574089] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d97fd8ce-beab-46d3-890b-7260e3649488 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Lock "17bb8415-dafd-47ed-9a14-52163ba5e7db-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2414.576105] env[68617]: INFO nova.compute.manager [None req-d97fd8ce-beab-46d3-890b-7260e3649488 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Terminating instance [ 2414.577804] env[68617]: DEBUG nova.compute.manager [None req-d97fd8ce-beab-46d3-890b-7260e3649488 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2414.578076] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d97fd8ce-beab-46d3-890b-7260e3649488 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2414.578546] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c863eca0-8adb-499e-a24b-f7ea7e7e7e94 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2414.587976] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fec6db25-e156-42fa-98ca-3fd18d819ccf {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2414.615930] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-d97fd8ce-beab-46d3-890b-7260e3649488 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 17bb8415-dafd-47ed-9a14-52163ba5e7db could not be found. [ 2414.616146] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-d97fd8ce-beab-46d3-890b-7260e3649488 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2414.616398] env[68617]: INFO nova.compute.manager [None req-d97fd8ce-beab-46d3-890b-7260e3649488 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2414.616649] env[68617]: DEBUG oslo.service.loopingcall [None req-d97fd8ce-beab-46d3-890b-7260e3649488 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2414.616858] env[68617]: DEBUG nova.compute.manager [-] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2414.616956] env[68617]: DEBUG nova.network.neutron [-] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2414.640748] env[68617]: DEBUG nova.network.neutron [-] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2414.648553] env[68617]: INFO nova.compute.manager [-] [instance: 17bb8415-dafd-47ed-9a14-52163ba5e7db] Took 0.03 seconds to deallocate network for instance. [ 2414.735718] env[68617]: DEBUG oslo_concurrency.lockutils [None req-d97fd8ce-beab-46d3-890b-7260e3649488 tempest-AttachVolumeNegativeTest-555816992 tempest-AttachVolumeNegativeTest-555816992-project-member] Lock "17bb8415-dafd-47ed-9a14-52163ba5e7db" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.162s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2459.357956] env[68617]: WARNING oslo_vmware.rw_handles [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2459.357956] env[68617]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2459.357956] env[68617]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2459.357956] env[68617]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2459.357956] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2459.357956] env[68617]: ERROR oslo_vmware.rw_handles response.begin() [ 2459.357956] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2459.357956] env[68617]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2459.357956] env[68617]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2459.357956] env[68617]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2459.357956] env[68617]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2459.357956] env[68617]: ERROR oslo_vmware.rw_handles [ 2459.359043] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Downloaded image file data 
c87eab51-bc9a-44dc-8f0d-7ab73283e453 to vmware_temp/794b5d85-c807-4fc6-a380-dfb54467a2c6/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2459.360439] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Caching image {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2459.360678] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Copying Virtual Disk [datastore2] vmware_temp/794b5d85-c807-4fc6-a380-dfb54467a2c6/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk to [datastore2] vmware_temp/794b5d85-c807-4fc6-a380-dfb54467a2c6/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk {{(pid=68617) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2459.360960] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-644e37da-e4a0-4b69-a0f8-2e2aeb0493e3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2459.368631] env[68617]: DEBUG oslo_vmware.api [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Waiting for the task: (returnval){ [ 2459.368631] env[68617]: value = "task-3470918" [ 2459.368631] env[68617]: _type = "Task" [ 2459.368631] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2459.376652] env[68617]: DEBUG oslo_vmware.api [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Task: {'id': task-3470918, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2459.879175] env[68617]: DEBUG oslo_vmware.exceptions [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Fault InvalidArgument not matched. 
{{(pid=68617) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2459.879479] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2459.880048] env[68617]: ERROR nova.compute.manager [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2459.880048] env[68617]: Faults: ['InvalidArgument'] [ 2459.880048] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] Traceback (most recent call last): [ 2459.880048] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2459.880048] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] yield resources [ 2459.880048] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2459.880048] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] self.driver.spawn(context, instance, image_meta, [ 2459.880048] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2459.880048] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2459.880048] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2459.880048] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] self._fetch_image_if_missing(context, vi) [ 2459.880048] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2459.880630] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] image_cache(vi, tmp_image_ds_loc) [ 2459.880630] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2459.880630] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] vm_util.copy_virtual_disk( [ 2459.880630] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2459.880630] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] session._wait_for_task(vmdk_copy_task) [ 2459.880630] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2459.880630] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] return self.wait_for_task(task_ref) [ 2459.880630] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2459.880630] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] return evt.wait() [ 2459.880630] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2459.880630] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] result = hub.switch() [ 2459.880630] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2459.880630] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] return self.greenlet.switch() [ 2459.880992] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2459.880992] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] self.f(*self.args, **self.kw) [ 2459.880992] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2459.880992] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] raise exceptions.translate_fault(task_info.error) [ 2459.880992] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2459.880992] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] Faults: ['InvalidArgument'] [ 2459.880992] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] [ 2459.880992] env[68617]: INFO nova.compute.manager [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Terminating instance [ 2459.881850] env[68617]: DEBUG oslo_concurrency.lockutils [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2459.882067] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2459.882303] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9de03fe0-5cf6-4a9a-9852-79d7c4dd8618 {{(pid=68617) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2459.884404] env[68617]: DEBUG nova.compute.manager [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2459.884625] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2459.885347] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cefd3ecc-826b-444a-b227-9dd4f510d37c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2459.892022] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Unregistering the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2459.892205] env[68617]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7e1b7797-77c1-4bb9-87ea-e64a2fec0dba {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2459.894298] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2459.894490] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68617) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2459.895435] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-10ede16c-fabd-42d9-a864-d595697225fa {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2459.900040] env[68617]: DEBUG oslo_vmware.api [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Waiting for the task: (returnval){ [ 2459.900040] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52387cf5-f60a-14fa-591d-223d56860e69" [ 2459.900040] env[68617]: _type = "Task" [ 2459.900040] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2459.907177] env[68617]: DEBUG oslo_vmware.api [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52387cf5-f60a-14fa-591d-223d56860e69, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2460.409615] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Preparing fetch location {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2460.410021] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Creating directory with path [datastore2] vmware_temp/785e4681-3df0-4daa-90bd-2a5a2682f3ff/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2460.410091] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1064593e-1f35-4080-a73b-e238db59bdae {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2460.429387] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Created directory with path [datastore2] vmware_temp/785e4681-3df0-4daa-90bd-2a5a2682f3ff/c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2460.429582] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Fetch image to [datastore2] vmware_temp/785e4681-3df0-4daa-90bd-2a5a2682f3ff/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2460.429738] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to [datastore2] vmware_temp/785e4681-3df0-4daa-90bd-2a5a2682f3ff/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk on the data store datastore2 {{(pid=68617) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2460.430496] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87560fe6-5d0d-4c8a-9100-0219138b6b2d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2460.436956] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5762a93a-1ff9-4e79-b05d-465cef6e4573 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2460.446026] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d41dadf3-679b-4f2e-8a81-009590582e70 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2461.222275] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6fa7ad8-8d9c-4999-bbc0-fa9f07d4afa0 {{(pid=68617) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2461.224869] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Unregistered the VM {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2461.225076] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Deleting contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2461.225248] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Deleting the datastore file [datastore2] 82f72313-f493-4acd-a95e-765feb74a358 {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2461.225481] env[68617]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8b2a43c4-cbf9-4ed9-a8dc-faa964519cc0 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2461.232149] env[68617]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-88e16ccf-b78a-483b-8bff-645c5de552a8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2461.233769] env[68617]: DEBUG oslo_vmware.api [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Waiting for the task: (returnval){ [ 2461.233769] env[68617]: value = "task-3470920" [ 2461.233769] env[68617]: _type = "Task" [ 2461.233769] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2461.241434] env[68617]: DEBUG oslo_vmware.api [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Task: {'id': task-3470920, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2461.253859] env[68617]: DEBUG nova.virt.vmwareapi.images [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Downloading image file data c87eab51-bc9a-44dc-8f0d-7ab73283e453 to the data store datastore2 {{(pid=68617) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2461.303619] env[68617]: DEBUG oslo_vmware.rw_handles [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/785e4681-3df0-4daa-90bd-2a5a2682f3ff/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68617) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2461.364535] env[68617]: DEBUG oslo_vmware.rw_handles [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Completed reading data from the image iterator. {{(pid=68617) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2461.364738] env[68617]: DEBUG oslo_vmware.rw_handles [None req-11ba7c74-60df-4b2d-81bc-a1e94794fc7c tempest-ServersTestJSON-1350841761 tempest-ServersTestJSON-1350841761-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/785e4681-3df0-4daa-90bd-2a5a2682f3ff/c87eab51-bc9a-44dc-8f0d-7ab73283e453/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68617) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2461.743562] env[68617]: DEBUG oslo_vmware.api [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Task: {'id': task-3470920, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067872} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2461.743981] env[68617]: DEBUG nova.virt.vmwareapi.ds_util [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Deleted the datastore file {{(pid=68617) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2461.743981] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Deleted contents of the VM from datastore datastore2 {{(pid=68617) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2461.744150] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2461.744323] env[68617]: INFO nova.compute.manager [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Took 1.86 seconds to destroy the instance on the hypervisor. 
[ 2461.746430] env[68617]: DEBUG nova.compute.claims [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Aborting claim: {{(pid=68617) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2461.746598] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2461.746828] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2461.858171] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a20791d-f7b1-4268-a464-27ecc6c9c493 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2461.865616] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1156354f-89fa-46a7-bc8b-bb8ba20bc6c3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2461.894578] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96d6ecb0-0c81-4d25-9a33-54c08670a5a3 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2461.901496] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd2009b0-2167-44dd-9e65-b0892f82c0ac {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2461.914777] env[68617]: DEBUG nova.compute.provider_tree [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2461.923221] env[68617]: DEBUG nova.scheduler.client.report [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2461.936360] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 
tempest-DeleteServersTestJSON-1358576707-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.189s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2461.936887] env[68617]: ERROR nova.compute.manager [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2461.936887] env[68617]: Faults: ['InvalidArgument'] [ 2461.936887] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] Traceback (most recent call last): [ 2461.936887] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2461.936887] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] self.driver.spawn(context, instance, image_meta, [ 2461.936887] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2461.936887] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2461.936887] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2461.936887] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] self._fetch_image_if_missing(context, vi) [ 2461.936887] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2461.936887] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] image_cache(vi, tmp_image_ds_loc) [ 2461.936887] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2461.937206] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] vm_util.copy_virtual_disk( [ 2461.937206] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2461.937206] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] session._wait_for_task(vmdk_copy_task) [ 2461.937206] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2461.937206] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] return self.wait_for_task(task_ref) [ 2461.937206] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2461.937206] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] return evt.wait() [ 2461.937206] env[68617]: ERROR nova.compute.manager [instance: 
82f72313-f493-4acd-a95e-765feb74a358] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2461.937206] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] result = hub.switch() [ 2461.937206] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2461.937206] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] return self.greenlet.switch() [ 2461.937206] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2461.937206] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] self.f(*self.args, **self.kw) [ 2461.937628] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2461.937628] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] raise exceptions.translate_fault(task_info.error) [ 2461.937628] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2461.937628] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] Faults: ['InvalidArgument'] [ 2461.937628] env[68617]: ERROR nova.compute.manager [instance: 82f72313-f493-4acd-a95e-765feb74a358] [ 2461.937628] env[68617]: DEBUG nova.compute.utils [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] VimFaultException {{(pid=68617) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2461.938870] env[68617]: DEBUG nova.compute.manager [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Build of instance 82f72313-f493-4acd-a95e-765feb74a358 was re-scheduled: A specified parameter was not correct: fileType [ 2461.938870] env[68617]: Faults: ['InvalidArgument'] {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2461.939265] env[68617]: DEBUG nova.compute.manager [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Unplugging VIFs for instance {{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2461.939436] env[68617]: DEBUG nova.compute.manager [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68617) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2461.939602] env[68617]: DEBUG nova.compute.manager [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2461.939760] env[68617]: DEBUG nova.network.neutron [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2462.231945] env[68617]: DEBUG nova.network.neutron [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2462.245351] env[68617]: INFO nova.compute.manager [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Took 0.31 seconds to deallocate network for instance. [ 2462.341862] env[68617]: INFO nova.scheduler.client.report [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Deleted allocations for instance 82f72313-f493-4acd-a95e-765feb74a358 [ 2462.365225] env[68617]: DEBUG oslo_concurrency.lockutils [None req-ab944065-9ecb-494a-8459-4b83d3de308c tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "82f72313-f493-4acd-a95e-765feb74a358" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 530.597s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2462.365493] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0b8ea8c9-9b88-4e1e-98d8-6a2daeda0c01 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "82f72313-f493-4acd-a95e-765feb74a358" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 334.655s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2462.365737] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0b8ea8c9-9b88-4e1e-98d8-6a2daeda0c01 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Acquiring lock "82f72313-f493-4acd-a95e-765feb74a358-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2462.365909] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0b8ea8c9-9b88-4e1e-98d8-6a2daeda0c01 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "82f72313-f493-4acd-a95e-765feb74a358-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s 
{{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2462.366089] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0b8ea8c9-9b88-4e1e-98d8-6a2daeda0c01 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "82f72313-f493-4acd-a95e-765feb74a358-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2462.369746] env[68617]: INFO nova.compute.manager [None req-0b8ea8c9-9b88-4e1e-98d8-6a2daeda0c01 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Terminating instance [ 2462.371804] env[68617]: DEBUG nova.compute.manager [None req-0b8ea8c9-9b88-4e1e-98d8-6a2daeda0c01 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Start destroying the instance on the hypervisor. {{(pid=68617) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2462.372018] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0b8ea8c9-9b88-4e1e-98d8-6a2daeda0c01 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Destroying instance {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2462.372312] env[68617]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e7357a50-fd5e-4ecf-9b6d-72a55f4acba4 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2462.383236] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c5cde62-1385-4501-ac01-07403c8a29ff {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2462.410189] env[68617]: WARNING nova.virt.vmwareapi.vmops [None req-0b8ea8c9-9b88-4e1e-98d8-6a2daeda0c01 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 82f72313-f493-4acd-a95e-765feb74a358 could not be found. [ 2462.410399] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-0b8ea8c9-9b88-4e1e-98d8-6a2daeda0c01 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Instance destroyed {{(pid=68617) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2462.410573] env[68617]: INFO nova.compute.manager [None req-0b8ea8c9-9b88-4e1e-98d8-6a2daeda0c01 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2462.410803] env[68617]: DEBUG oslo.service.loopingcall [None req-0b8ea8c9-9b88-4e1e-98d8-6a2daeda0c01 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2462.411310] env[68617]: DEBUG nova.compute.manager [-] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Deallocating network for instance {{(pid=68617) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2462.411472] env[68617]: DEBUG nova.network.neutron [-] [instance: 82f72313-f493-4acd-a95e-765feb74a358] deallocate_for_instance() {{(pid=68617) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2462.435665] env[68617]: DEBUG nova.network.neutron [-] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Updating instance_info_cache with network_info: [] {{(pid=68617) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2462.444891] env[68617]: INFO nova.compute.manager [-] [instance: 82f72313-f493-4acd-a95e-765feb74a358] Took 0.03 seconds to deallocate network for instance. [ 2462.539628] env[68617]: DEBUG oslo_concurrency.lockutils [None req-0b8ea8c9-9b88-4e1e-98d8-6a2daeda0c01 tempest-DeleteServersTestJSON-1358576707 tempest-DeleteServersTestJSON-1358576707-project-member] Lock "82f72313-f493-4acd-a95e-765feb74a358" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.174s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2463.693710] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2464.699364] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2464.699645] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Cleaning up deleted instances {{(pid=68617) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 2464.708220] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] There are 0 instances to clean {{(pid=68617) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 2465.707596] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2465.707849] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Starting heal instance info cache {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2465.707917] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Rebuilding the list of instances to heal {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2465.721035] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 797b434e-a913-43dc-a1df-39fe82da1221] Skipping network cache update for instance because it is Building. 
{{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2465.721170] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 6a1aa3fb-f182-4b9d-8add-7dfc70472be8] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2465.721258] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 2edb4d02-dec4-4e7d-9c57-6b2b147740ad] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2465.721386] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] [instance: 9a360442-5f4c-4379-a8d5-a6e09ac29ea9] Skipping network cache update for instance because it is Building. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2465.721512] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Didn't find any instances for network info cache update. {{(pid=68617) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2466.698698] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2466.698931] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2466.699192] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2467.707158] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2468.699500] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2468.699656] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68617) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2468.699844] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2468.699975] env[68617]: DEBUG nova.compute.manager [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Cleaning up deleted instances with incomplete migration {{(pid=68617) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 2471.707940] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2471.720079] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2471.720316] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2471.720484] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2471.720637] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68617) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2471.721927] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ace87e13-e28e-4da6-95ee-834f2f5b5f85 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2471.730816] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a261fe76-872c-4288-b624-0b6943bbed80 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2471.744223] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8ffbc83-8285-489b-813c-c026f6441934 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2471.750077] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67ff14f5-0430-4467-94ce-3bea801dfc09 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2471.778285] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Hypervisor/Node resource 
view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180933MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68617) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2471.778429] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2471.778612] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2471.910987] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 797b434e-a913-43dc-a1df-39fe82da1221 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2471.911182] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 6a1aa3fb-f182-4b9d-8add-7dfc70472be8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2471.911308] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 2edb4d02-dec4-4e7d-9c57-6b2b147740ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2471.911427] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Instance 9a360442-5f4c-4379-a8d5-a6e09ac29ea9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68617) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2471.911609] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Total usable vcpus: 48, total allocated vcpus: 4 {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2471.911745] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1024MB phys_disk=200GB used_disk=4GB total_vcpus=48 used_vcpus=4 pci_stats=[] {{(pid=68617) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2471.927567] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Refreshing inventories for resource provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 2471.939932] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Updating ProviderTree inventory for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 2471.940121] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Updating inventory in ProviderTree for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2471.949940] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Refreshing aggregate associations for resource provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f, aggregates: None {{(pid=68617) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 2471.965582] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Refreshing trait associations for resource provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_VMDK {{(pid=68617) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 2472.017094] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0337bada-d974-4824-9361-f9289b0c4eb2 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2472.024712] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-fb0bcd1f-4d65-4c23-b966-2d1557778434 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2472.053264] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94bb42ef-0ce5-4248-86f7-364342f94823 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2472.059817] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e95a077e-332e-43ec-897b-2b96d134fefd {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2472.073187] env[68617]: DEBUG nova.compute.provider_tree [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2472.081867] env[68617]: DEBUG nova.scheduler.client.report [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2472.095863] env[68617]: DEBUG nova.compute.resource_tracker [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68617) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2472.096092] env[68617]: DEBUG oslo_concurrency.lockutils [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.317s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2473.088053] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2473.694397] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2473.698093] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2483.238151] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Acquiring lock "5624d841-dffb-4a03-b87d-8d77e3a15755" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2483.238151] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Lock "5624d841-dffb-4a03-b87d-8d77e3a15755" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2483.254334] env[68617]: DEBUG nova.compute.manager [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] [instance: 5624d841-dffb-4a03-b87d-8d77e3a15755] Starting instance... {{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2483.318039] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2483.318039] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2483.318039] env[68617]: INFO nova.compute.claims [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] [instance: 5624d841-dffb-4a03-b87d-8d77e3a15755] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2483.411580] env[68617]: DEBUG oslo_concurrency.lockutils [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Acquiring lock "2ad7b4a1-ba47-4835-9dad-c0191a76fee2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2483.411824] env[68617]: DEBUG oslo_concurrency.lockutils [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Lock "2ad7b4a1-ba47-4835-9dad-c0191a76fee2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2483.422377] env[68617]: DEBUG nova.compute.manager [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] [instance: 2ad7b4a1-ba47-4835-9dad-c0191a76fee2] Starting instance... 
{{(pid=68617) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2483.464099] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25b6b9d9-a8a7-4510-8780-0c3fa792572c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2483.470241] env[68617]: DEBUG oslo_concurrency.lockutils [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2483.471265] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5d45fe7-bb8a-47a0-adc8-40461a178965 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2483.501445] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a330983d-44a1-40e6-85c2-494539faf73a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2483.508354] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46682bae-0202-420a-b062-09186a2e63ee {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2483.521336] env[68617]: DEBUG nova.compute.provider_tree [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2483.529718] env[68617]: DEBUG nova.scheduler.client.report [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2483.542542] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.226s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2483.542988] env[68617]: DEBUG nova.compute.manager [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] [instance: 5624d841-dffb-4a03-b87d-8d77e3a15755] Start building networks asynchronously for instance. 
{{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2483.545137] env[68617]: DEBUG oslo_concurrency.lockutils [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.075s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2483.546445] env[68617]: INFO nova.compute.claims [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] [instance: 2ad7b4a1-ba47-4835-9dad-c0191a76fee2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2483.576494] env[68617]: DEBUG nova.compute.utils [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2483.577635] env[68617]: DEBUG nova.compute.manager [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] [instance: 5624d841-dffb-4a03-b87d-8d77e3a15755] Not allocating networking since 'none' was specified. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 2483.589306] env[68617]: DEBUG nova.compute.manager [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] [instance: 5624d841-dffb-4a03-b87d-8d77e3a15755] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2483.648804] env[68617]: DEBUG nova.compute.manager [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] [instance: 5624d841-dffb-4a03-b87d-8d77e3a15755] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2483.674190] env[68617]: DEBUG nova.virt.hardware [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2483.674459] env[68617]: DEBUG nova.virt.hardware [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2483.674618] env[68617]: DEBUG nova.virt.hardware [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2483.674864] env[68617]: DEBUG nova.virt.hardware [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2483.675032] env[68617]: DEBUG nova.virt.hardware [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2483.675184] env[68617]: DEBUG nova.virt.hardware [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2483.675536] env[68617]: DEBUG nova.virt.hardware [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2483.675773] env[68617]: DEBUG nova.virt.hardware [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2483.675972] env[68617]: DEBUG nova.virt.hardware [None req-c2482455-7245-4e41-9d95-1758ecee40d5 
tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2483.676157] env[68617]: DEBUG nova.virt.hardware [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2483.676332] env[68617]: DEBUG nova.virt.hardware [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2483.677194] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc0f2307-b1fd-4e30-8e8e-a29f0670ee1f {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2483.687176] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69c4102c-f982-4f90-98d8-75e3b8749fc8 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2483.692840] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74a83a35-8f4a-4952-9464-2082740db8f1 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2483.706768] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc8ceff1-a5fb-401b-a6a4-22c677c46f6b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2483.709787] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] [instance: 5624d841-dffb-4a03-b87d-8d77e3a15755] Instance VIF info [] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2483.715317] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Creating folder: Project (1c10614c215a4f8cbba1fbb156b42a17). Parent ref: group-v693691. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2483.715546] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c9d8825c-4526-4a5e-add3-ccfd85ecf684 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2483.746707] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fbde48e-bbbc-48af-b92d-300b1d7fc56d {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2483.749131] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Created folder: Project (1c10614c215a4f8cbba1fbb156b42a17) in parent group-v693691. 
[ 2483.749305] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Creating folder: Instances. Parent ref: group-v693801. {{(pid=68617) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2483.749505] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8dfafbb4-ba21-48ca-96b1-bc8f6d73b86a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2483.756097] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6cc5cb5-cfb4-48fc-b6c9-13097e4cf263 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2483.760266] env[68617]: INFO nova.virt.vmwareapi.vm_util [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Created folder: Instances in parent group-v693801. [ 2483.760477] env[68617]: DEBUG oslo.service.loopingcall [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2483.760928] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5624d841-dffb-4a03-b87d-8d77e3a15755] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2483.761127] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-34afc054-f433-42b5-a517-f1bb5faab46a {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2483.780715] env[68617]: DEBUG nova.compute.provider_tree [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Inventory has not changed in ProviderTree for provider: 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f {{(pid=68617) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2483.785803] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2483.785803] env[68617]: value = "task-3470923" [ 2483.785803] env[68617]: _type = "Task" [ 2483.785803] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2483.789801] env[68617]: DEBUG nova.scheduler.client.report [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Inventory has not changed for provider 5d1262ef-b3ca-43a9-aa2d-64a8e3cd563f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68617) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2483.797517] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470923, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2483.805064] env[68617]: DEBUG oslo_concurrency.lockutils [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.260s {{(pid=68617) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2483.805523] env[68617]: DEBUG nova.compute.manager [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] [instance: 2ad7b4a1-ba47-4835-9dad-c0191a76fee2] Start building networks asynchronously for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2483.839405] env[68617]: DEBUG nova.compute.utils [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Using /dev/sd instead of None {{(pid=68617) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2483.841389] env[68617]: DEBUG nova.compute.manager [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] [instance: 2ad7b4a1-ba47-4835-9dad-c0191a76fee2] Not allocating networking since 'none' was specified. {{(pid=68617) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 2483.853294] env[68617]: DEBUG nova.compute.manager [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] [instance: 2ad7b4a1-ba47-4835-9dad-c0191a76fee2] Start building block device mappings for instance. {{(pid=68617) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2483.922475] env[68617]: DEBUG nova.compute.manager [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] [instance: 2ad7b4a1-ba47-4835-9dad-c0191a76fee2] Start spawning the instance on the hypervisor. 
{{(pid=68617) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2483.944660] env[68617]: DEBUG nova.virt.hardware [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T05:31:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T05:31:25Z,direct_url=,disk_format='vmdk',id=c87eab51-bc9a-44dc-8f0d-7ab73283e453,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f1a3ab6230dd468b8019424ce71de8ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T05:31:26Z,virtual_size=,visibility=), allow threads: False {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2483.945026] env[68617]: DEBUG nova.virt.hardware [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Flavor limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2483.945190] env[68617]: DEBUG nova.virt.hardware [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Image limits 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2483.945377] env[68617]: DEBUG nova.virt.hardware [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Flavor pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2483.945521] env[68617]: DEBUG nova.virt.hardware [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Image pref 0:0:0 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2483.945667] env[68617]: DEBUG nova.virt.hardware [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68617) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2483.945870] env[68617]: DEBUG nova.virt.hardware [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2483.946042] env[68617]: DEBUG nova.virt.hardware [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2483.946214] env[68617]: DEBUG nova.virt.hardware [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 
[ 2483.946214] env[68617]: DEBUG nova.virt.hardware [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Got 1 possible topologies {{(pid=68617) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 2483.946375] env[68617]: DEBUG nova.virt.hardware [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 2483.946544] env[68617]: DEBUG nova.virt.hardware [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68617) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 2483.947468] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-117c3a46-3768-4544-8010-b9654bfda38c {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2483.957028] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-137af211-638d-4dc1-884b-9918ab8671ae {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2483.972180] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] [instance: 2ad7b4a1-ba47-4835-9dad-c0191a76fee2] Instance VIF info [] {{(pid=68617) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 2483.977768] env[68617]: DEBUG oslo.service.loopingcall [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68617) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 2483.978027] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2ad7b4a1-ba47-4835-9dad-c0191a76fee2] Creating VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 2483.978236] env[68617]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-401ac3d9-31fd-4729-9a7f-ab10259b3d2b {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2483.993677] env[68617]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 2483.993677] env[68617]: value = "task-3470924"
[ 2483.993677] env[68617]: _type = "Task"
[ 2483.993677] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2484.001290] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470924, 'name': CreateVM_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2484.296237] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470923, 'name': CreateVM_Task, 'duration_secs': 0.273015} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
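[annotation] Folder.CreateVM_Task is asynchronous on the vCenter side: the driver submits the task, receives a task handle (task-3470924 here), and the "Waiting for the task ... progress is 0% ... completed successfully" lines come from oslo.vmware polling that handle until it reaches a terminal state (the loopingcall line above shows this runs under an oslo.service looping call). A stdlib-only sketch of the polling pattern, with a hypothetical poll callable standing in for the real Task managed-object read:

    import time

    def wait_for_task(poll, interval=0.5, timeout=300.0):
        # `poll` returns a dict such as {'state': 'running', 'progress': 0};
        # loop until the task succeeds, fails, or the deadline passes.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            task = poll()
            if task['state'] == 'success':
                return task
            if task['state'] == 'error':
                raise RuntimeError('task failed: %s' % task.get('error'))
            time.sleep(interval)
        raise TimeoutError('task did not finish within %ss' % timeout)

    # Toy poller that "completes" on the second call.
    states = iter([{'state': 'running', 'progress': 0},
                   {'state': 'success', 'duration_secs': 0.273015}])
    print(wait_for_task(lambda: next(states), interval=0.01))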
[ 2484.296539] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5624d841-dffb-4a03-b87d-8d77e3a15755] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 2484.296946] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2484.297021] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2484.297341] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 2484.297574] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c51af638-7ceb-4335-bb2f-30749fa56465 {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2484.302122] env[68617]: DEBUG oslo_vmware.api [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Waiting for the task: (returnval){
[ 2484.302122] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]52251f73-a159-80a1-35b6-183b40eb4439"
[ 2484.302122] env[68617]: _type = "Task"
[ 2484.302122] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2484.309682] env[68617]: DEBUG oslo_vmware.api [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]52251f73-a159-80a1-35b6-183b40eb4439, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2484.503138] env[68617]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470924, 'name': CreateVM_Task, 'duration_secs': 0.248586} completed successfully. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
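[annotation] Two concurrent builds (instances 5624d841 and 2ad7b4a1) use the same Glance image, so each request first serialises on a lock named after the image-cache path, then runs HostDatastoreBrowser.SearchDatastore_Task to check whether the VMDK is already cached on datastore2; only a miss would trigger a download. A minimal stdlib sketch of that per-image serialisation (nova itself uses oslo.concurrency named locks plus the external semaphore shown above; is_cached and fetch are hypothetical callables):

    import threading
    from collections import defaultdict

    _cache_locks = defaultdict(threading.Lock)
    _registry_lock = threading.Lock()

    def cache_lock(datastore, image_id):
        # One lock per "[datastore] devstack-image-cache_base/<image>"
        # key, mirroring the lock names in the log above.
        key = '[%s] devstack-image-cache_base/%s' % (datastore, image_id)
        with _registry_lock:
            return _cache_locks[key]

    def fetch_image_if_missing(datastore, image_id, is_cached, fetch):
        # Only one request per image may probe the cache and download.
        with cache_lock(datastore, image_id):
            if not is_cached():
                fetch()

    fetch_image_if_missing('datastore2',
                           'c87eab51-bc9a-44dc-8f0d-7ab73283e453',
                           is_cached=lambda: True,
                           fetch=lambda: None)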
[ 2484.503271] env[68617]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2ad7b4a1-ba47-4835-9dad-c0191a76fee2] Created VM on the ESX host {{(pid=68617) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 2484.503653] env[68617]: DEBUG oslo_concurrency.lockutils [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2484.813118] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2484.813318] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] [instance: 5624d841-dffb-4a03-b87d-8d77e3a15755] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 2484.813533] env[68617]: DEBUG oslo_concurrency.lockutils [None req-c2482455-7245-4e41-9d95-1758ecee40d5 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2484.813745] env[68617]: DEBUG oslo_concurrency.lockutils [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Acquired lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2484.814114] env[68617]: DEBUG oslo_concurrency.lockutils [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 2484.814370] env[68617]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ae134153-4530-43bf-915c-f36f325a58df {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2484.820045] env[68617]: DEBUG oslo_vmware.api [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Waiting for the task: (returnval){
[ 2484.820045] env[68617]: value = "session[527781b0-b30d-888c-2cc2-ff79c79797ba]5285c924-5aea-97ab-75c8-257f5fc8e6cd"
[ 2484.820045] env[68617]: _type = "Task"
[ 2484.820045] env[68617]: } to complete. {{(pid=68617) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2484.827567] env[68617]: DEBUG oslo_vmware.api [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Task: {'id': session[527781b0-b30d-888c-2cc2-ff79c79797ba]5285c924-5aea-97ab-75c8-257f5fc8e6cd, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68617) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2485.330320] env[68617]: DEBUG oslo_concurrency.lockutils [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Releasing lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2485.330598] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] [instance: 2ad7b4a1-ba47-4835-9dad-c0191a76fee2] Processing image c87eab51-bc9a-44dc-8f0d-7ab73283e453 {{(pid=68617) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 2485.330769] env[68617]: DEBUG oslo_concurrency.lockutils [None req-15dcc037-9cc1-4cda-b37f-f347d20b6967 tempest-ServerShowV247Test-2103361930 tempest-ServerShowV247Test-2103361930-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/c87eab51-bc9a-44dc-8f0d-7ab73283e453/c87eab51-bc9a-44dc-8f0d-7ab73283e453.vmdk" {{(pid=68617) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2488.651291] env[68617]: DEBUG oslo_service.periodic_task [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=68617) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2488.651726] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Getting list of instances from cluster (obj){
[ 2488.651726] env[68617]: value = "domain-c8"
[ 2488.651726] env[68617]: _type = "ClusterComputeResource"
[ 2488.651726] env[68617]: } {{(pid=68617) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}}
[ 2488.652862] env[68617]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76849594-7439-41cc-a2f4-27e4aecd36cc {{(pid=68617) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2488.666755] env[68617]: DEBUG nova.virt.vmwareapi.vmops [None req-f299aed3-1d2f-4297-9f84-1195d791f7a0 None None] Got total of 6 instances {{(pid=68617) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}}
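[annotation] The final records are a periodic task firing: oslo.service invokes ComputeManager._cleanup_running_deleted_instances on a timer, and the task lists the VMs on the ClusterComputeResource (domain-c8) through the PropertyCollector so it can reconcile them against instances the database marks deleted. A stdlib sketch of the periodic-invocation pattern; nova's real implementation uses oslo.service's @periodic_task.periodic_task decorator with configurable spacing, and the task body below is hypothetical:

    import threading
    import time

    def run_periodically(task, spacing):
        # Stripped-down stand-in for run_periodic_tasks: call `task`
        # roughly every `spacing` seconds on a daemon thread.
        def loop():
            while True:
                started = time.monotonic()
                task()
                time.sleep(max(0.0, spacing - (time.monotonic() - started)))
        thread = threading.Thread(target=loop, daemon=True)
        thread.start()
        return thread

    def cleanup_running_deleted_instances():
        # Hypothetical body: list VMs on the cluster and reconcile them
        # against instances the database says are deleted.
        print('reconciling hypervisor VM list against the database')

    run_periodically(cleanup_running_deleted_instances, spacing=600)
    time.sleep(0.1)  # let the first iteration run before the demo exits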